22 results for user-driven security adaptation

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

This first edition of the workshop Model-driven Software Adaptation (M-ADAPT'07) took place at the Technische Universität Berlin, in conjunction with the International Conference ECOOP'07, in the beautiful and buzzing city of Berlin on 30 July 2007. The workshop was organized by Gordon Blair, Nelly Bencomo, and Robert France. Participants explored how to develop appropriate model-driven approaches to model, analyze, and validate the volatile properties of the behaviour of adaptive systems and their environments. This report gives an overview of the presentations as well as an account of the fruitful discussions that took place at M-ADAPT'07. © 2008 Springer-Verlag Berlin Heidelberg.

Relevance:

100.00%

Publisher:

Abstract:

A self-adaptive system adjusts its configuration to tolerate changes in its operating environment. To date, requirements modelling methodologies for self-adaptive systems have necessitated analysis of all potential system configurations, and of the circumstances under which each is to be adopted. We argue that, by explicitly capturing and modelling uncertainty in the operating environment, and by verifying and analysing this model at runtime, it is possible for a system to adapt to tolerate some conditions that were not fully considered at design time. In this paper we showcase our tools and research results. © 2012 IEEE.
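
To make the runtime part of this idea concrete, here is a minimal sketch in Python of a monitor-analyse-adapt loop over an environment model. The `EnvironmentModel` class, the threshold rule, and the callables are all illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: verify a (hypothetical) environment model at runtime and
# reconfigure when observations drift past what design time assumed.
from dataclasses import dataclass, field


@dataclass
class EnvironmentModel:
    """Running record of an observed environmental quantity."""
    samples: list = field(default_factory=list)

    def observe(self, value: float) -> None:
        self.samples.append(value)

    def exceeds(self, threshold: float) -> bool:
        # Analyse the model: has the recent average drifted past a bound
        # the design-time analysis did not fully consider?
        recent = self.samples[-10:]
        return bool(recent) and sum(recent) / len(recent) > threshold


def adaptation_step(model: EnvironmentModel, sensor, reconfigure) -> None:
    """One monitor-analyse-adapt iteration; sensor and reconfigure are
    callables supplied by the surrounding system."""
    model.observe(sensor())
    if model.exceeds(threshold=0.8):
        reconfigure("degraded-mode")  # switch to a more tolerant configuration
```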

Relevance:

100.00%

Publisher:

Abstract:

Overlaying maps using a desktop GIS is often the first step of a multivariate spatial analysis. The potential of this operation has increased considerably as data sources and Web services to manipulate them are becoming widely available via the Internet. Standards from the OGC enable such geospatial mashups to be seamless and user-driven, involving discovery of thematic data. The user is naturally inclined to look for spatial clusters and correlation of outcomes. Using classical cluster detection scan methods to identify multivariate associations can be problematic in this context, because of a lack of control on or knowledge about background populations. For public health and epidemiological mapping, this limiting factor can be critical, but often the focus is on spatial identification of risk factors associated with health or clinical status. In this article we point out that this association itself can ensure some control on underlying populations, and develop an exploratory scan statistic framework for multivariate associations. Inference using statistical map methodologies can be used to test the clustered associations. The approach is illustrated with a hypothetical data example and an epidemiological study on community MRSA. Scenarios of potential use for online mashups are introduced, but full implementation is left for further research.

Relevance:

100.00%

Publisher:

Abstract:

Overlaying maps using a desktop GIS is often the first step of a multivariate spatial analysis. The potential of this operation has increased considerably as data sources and Web services to manipulate them are becoming widely available via the Internet. Standards from the OGC enable such geospatial ‘mashups’ to be seamless and user-driven, involving discovery of thematic data. The user is naturally inclined to look for spatial clusters and ‘correlation’ of outcomes. Using classical cluster detection scan methods to identify multivariate associations can be problematic in this context, because of a lack of control on or knowledge about background populations. For public health and epidemiological mapping, this limiting factor can be critical, but often the focus is on spatial identification of risk factors associated with health or clinical status. In this article we point out that this association itself can ensure some control on underlying populations, and develop an exploratory scan statistic framework for multivariate associations. Inference using statistical map methodologies can be used to test the clustered associations. The approach is illustrated with a hypothetical data example and an epidemiological study on community MRSA. Scenarios of potential use for online mashups are introduced, but full implementation is left for further research.
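
For readers who want to experiment with scan-based cluster detection, the sketch below implements a generic Kulldorff-style Bernoulli likelihood-ratio scan over windows of the k nearest points. It is a simplified stand-in, not the article's ScankOO method or its spatial entropy index.

```python
# Generic circular scan statistic (Bernoulli model): find the window of k
# nearest points with the highest likelihood ratio of excess cases.
import math


def log_lr(c: int, n: int, C: int, N: int) -> float:
    """Log-likelihood ratio for c cases among n points in the window versus
    C cases among N points overall."""
    def ll(cases: int, points: int) -> float:
        if points == 0 or cases in (0, points):
            return 0.0
        p = cases / points
        return cases * math.log(p) + (points - cases) * math.log(1 - p)
    return ll(c, n) + ll(C - c, N - n) - ll(C, N)


def best_cluster(points, k: int = 10):
    """points: iterable of (x, y, is_case) with is_case in {0, 1}.
    Returns (score, centre) for the best-scoring window."""
    points = list(points)
    C = sum(p[2] for p in points)
    N = len(points)
    best = (0.0, None)
    for cx, cy, _ in points:
        window = sorted(points,
                        key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)[:k]
        n, c = len(window), sum(p[2] for p in window)
        score = log_lr(c, n, C, N)
        if c / n > C / N and score > best[0]:  # only elevated-risk windows
            best = (score, (cx, cy))
    return best
```

Significance of the best-scoring window would then be assessed by Monte Carlo relabelling of the case marks, the usual inference step for scan statistics.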

Relevance:

40.00%

Publisher:

Abstract:

This paper describes an online survey that was conducted to explore typical Internet users' awareness and knowledge of specific technologies that relate to their security and privacy when using a Web browser to access the Internet. The survey was conducted using an anonymous, online questionnaire. Over a four-month period, 237 individuals completed the questionnaire. Respondents were predominantly Canadian, with substantial numbers from the United Kingdom and the United States. Important findings include evidence that users have tried to educate themselves regarding their online security and privacy, but with limited success; different interpretations of the term "secure Web site" can lead to very different levels of trust in a site; respondents strongly expressed their skepticism about privacy policies, but nevertheless believe that sites can be trusted to respect their stated policies; and users may confuse browser cookies with other types of data stored locally by browsers, leading to inappropriate conclusions about the risks they present.

Relevance:

40.00%

Publisher:

Abstract:

This research is focused on the optimisation of resource utilisation in wireless mobile networks, taking into account the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated, including video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems.

A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It has been shown that Pause Intensity is closely correlated with subjective quality measurement in terms of the Mean Opinion Score, and that this correlation is content independent.

Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach addresses both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment, where the proposed research framework is examined and the results are compared with existing scheduling methods in terms of the achievable fairness, efficiency and correlation.

Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users by considering their perceived quality for the services received, while a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE.

Finally, the work on interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism for the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum. The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and each of the WiFi access points involved. The performance of non-seamless and user-controlled mobile traffic offloading (through mobile WiFi devices) has been evaluated and compared with that of standard operator-controlled WiFi hotspots.
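
As a rough illustration of a pause-based continuity metric, here is a sketch that combines the two impairment factors named above, pause duration and pause frequency. The exact Pause Intensity formulation is defined in the thesis; the combination used below is an assumption for illustration only.

```python
# Hedged sketch of a pause-based continuity metric: 0.0 means uninterrupted
# playback; the score grows with both total stalled time and stall count.
# This combination is illustrative, not the thesis's exact formula.


def pause_intensity(pauses: list[float], playback_time: float) -> float:
    """pauses: duration in seconds of each stall observed during
    playback_time seconds of viewing."""
    if playback_time <= 0:
        raise ValueError("playback_time must be positive")
    stall_fraction = sum(pauses) / playback_time         # time-based impairment
    stalls_per_min = 60.0 * len(pauses) / playback_time  # frequency impairment
    return stall_fraction * stalls_per_min


# Two stalls totalling 3.5 s in a two-minute session:
print(pause_intensity([2.0, 1.5], playback_time=120.0))  # ~0.029
```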

Relevance:

40.00%

Publisher:

Abstract:

In this work we present a quality-driven approach to DASH (Dynamic Adaptive Streaming over HTTP) for segment selection in varying network conditions. Current adaptation algorithms focus largely on regulating data rates using network-layer parameters, selecting the level of quality on offer that can eliminate buffer underrun without considering picture fidelity. In reality, viewers may accept a level of buffer underrun in order to achieve an improved level of picture fidelity. In this case, conventional DASH algorithms can cause extreme degradation of picture fidelity when attempting to eliminate buffer underrun with scarce bandwidth availability. Our work is concerned with a quality-aware rate adaptation scheme that maximizes the client's quality of experience in terms of both continuity and fidelity (picture quality). Results show that the proposed scheme can maintain a high level of quality for streaming services, especially at low packet loss rates. It is also shown that by eliminating buffer underrun completely, the PSNR that reflects the picture quality of the video is greatly reduced. Our scheme exposes the trade-off between continuity-based quality and resolution-based quality, which can be used to set threshold values for the level of quality desired by clients with different quality requirements. © 2013 IEEE.
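
A minimal sketch of the kind of quality-aware segment selection discussed above: instead of always picking the representation that avoids underrun, tolerate a bounded buffer risk when it buys fidelity. The representation ladder and fidelity scores are hypothetical, and this is not the paper's exact algorithm.

```python
# Quality-aware DASH segment selection sketch: prefer the highest-fidelity
# representation whose download should still leave a safety buffer.
from dataclasses import dataclass


@dataclass
class Representation:
    bitrate_kbps: int
    fidelity: float  # e.g. expected PSNR for this encoding


def select(reps, throughput_kbps: float, buffer_s: float,
           segment_s: float = 4.0, min_buffer_s: float = 2.0):
    def buffer_after(rep):
        # Buffer level once this segment has been fetched and one segment
        # duration of video has played out.
        download_s = rep.bitrate_kbps * segment_s / max(throughput_kbps, 1.0)
        return buffer_s + segment_s - download_s

    safe = [r for r in reps if buffer_after(r) >= min_buffer_s]
    if safe:
        return max(safe, key=lambda r: r.fidelity)   # fidelity first
    return max(reps, key=buffer_after)               # else protect continuity


ladder = [Representation(800, 34.0), Representation(2400, 38.5),
          Representation(6000, 42.0)]
print(select(ladder, throughput_kbps=3000.0, buffer_s=6.0).bitrate_kbps)  # 6000
```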

Relevance:

30.00%

Publisher:

Abstract:

This research describes the development of a groupware system which adds security services to a Computer Supported Cooperative Work system operating over the Internet. The security services use cryptographic techniques to provide a secure access control service and an information protection service. These services are implemented as two protection layers for the groupware system, called the External Security Layer (ESL) and the Internal Security Layer (ISL) respectively. The security services are sufficiently flexible to allow the groupware system to operate in both synchronous and asynchronous modes. The groupware system developed, known as Secure Software Inspection Groupware (SecureSIG), provides security for a distributed group performing software inspection. SecureSIG extends previous work on developing flexible software inspection groupware (FlexSIG; Sahibuddin, 1999): the SecureSIG model extends the FlexSIG model, and the prototype system was added to the FlexSIG prototype. The prototype was built by integrating existing software, communication and cryptography tools and technology, using the Java Cryptography Extension (JCE) and Internet technology. To test the suitability and transparency of the system, an evaluation was conducted using a questionnaire to assess user acceptability.
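
The thesis builds its layers in Java with JCE; purely as an illustration of the ESL/ISL layering idea, here is a Python analogue using the `cryptography` package, with a toy access-control list. All names are invented for the sketch.

```python
# Two-layer protection sketch: an outer access-control check (ESL-like)
# wrapping an inner encryption layer (ISL-like).
from cryptography.fernet import Fernet

KEY = Fernet.generate_key()                          # shared symmetric key
ACL = {"alice": {"read", "write"}, "bob": {"read"}}  # toy access-control list


def esl_check(user: str, action: str) -> None:
    """External layer: reject the request before any data is touched."""
    if action not in ACL.get(user, set()):
        raise PermissionError(f"{user} may not {action}")


def isl_protect(plaintext: bytes) -> bytes:
    """Internal layer: encrypt inspection artefacts before storage/transit."""
    return Fernet(KEY).encrypt(plaintext)


def store_document(user: str, doc: bytes) -> bytes:
    esl_check(user, "write")   # outer layer first
    return isl_protect(doc)    # then inner protection


token = store_document("alice", b"inspection notes")
print(Fernet(KEY).decrypt(token))  # b'inspection notes'
```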

Relevance:

30.00%

Publisher:

Abstract:

The advent of personal communication systems within the last decade has depended upon the utilization of advanced digital schemes for source and channel coding and for modulation. The inherent digital nature of the communications processing has allowed the convenient incorporation of cryptographic techniques to implement security in these communications systems. There are various security requirements, of both the service provider and the mobile subscriber, which may be provided for in a personal communications system. Such security provisions include the privacy of user data, the authentication of communicating parties, the provision for data integrity, and the provision for both location confidentiality and party anonymity. This thesis is concerned with an investigation of the private-key and public-key cryptographic techniques pertinent to the security requirements of personal communication systems, and an analysis of the security provisions of Second-Generation personal communication systems is presented. Particular attention has been paid to the properties of the cryptographic protocols which have been employed in current Second-Generation systems. It has been found that certain security-related protocols implemented in the Second-Generation systems have specific weaknesses. A theoretical evaluation of these protocols has been performed using formal analysis techniques, and certain assumptions made during the development of the systems are shown to contribute to the security weaknesses. Various attack scenarios which exploit these protocol weaknesses are presented. The Fiat-Shamir zero-knowledge cryptosystem is presented as an example of how asymmetric-algorithm cryptography may be employed as part of an improved security solution. Various modifications to this cryptosystem have been evaluated, and their critical parameters are shown to be capable of being optimized to suit a particular application. The implementation of such a system using current smart card technology has been evaluated.
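
Since the Fiat-Shamir scheme is central to the improved solution described, here is one round of the textbook identification protocol in Python, using deliberately tiny, insecure parameters for readability.

```python
# One round of textbook Fiat-Shamir identification. The prover knows s with
# v = s^2 mod n; repeating the round halves a cheater's success odds each time.
import secrets

p, q = 1009, 1013            # toy primes; real systems use large RSA moduli
n = p * q
s = 123457                   # prover's secret (coprime with n)
v = pow(s, 2, n)             # public verification value


def one_round() -> bool:
    r = secrets.randbelow(n - 1) + 1
    x = pow(r, 2, n)                     # commitment
    e = secrets.randbelow(2)             # verifier's challenge bit
    y = (r * pow(s, e, n)) % n           # response
    return pow(y, 2, n) == (x * pow(v, e, n)) % n  # y^2 == x * v^e (mod n)


assert all(one_round() for _ in range(20))  # 20 rounds: cheat odds <= 2^-20
```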

Relevance:

30.00%

Publisher:

Abstract:

The development of increasingly powerful computers, which has enabled the use of windowing software, has also opened the way for the computer study, via simulation, of very complex physical systems. In this study, the main issues related to the implementation of interactive simulations of complex systems are identified and discussed. Most existing simulators are closed in the sense that there is no access to the source code and, even if it were available, adaptation to interaction with other systems would require extensive code re-writing. This work aims to increase the flexibility of such software by developing a set of object-oriented simulation classes, which can be extended, by subclassing, at any level, i.e., at the problem domain, presentation or interaction levels. A strategy, which involves the use of an object-oriented framework, concurrent execution of several simulation modules, use of a networked windowing system and the re-use of existing software written in procedural languages, is proposed. A prototype tool which combines these techniques has been implemented and is presented. It allows the on-line definition of the configuration of the physical system and generates the appropriate graphical user interface. Simulation routines have been developed for the chemical recovery cycle of a paper pulp mill. The application, by creation of new classes, of the prototype to the interactive simulation of this physical system is described. Besides providing visual feedback, the resulting graphical user interface greatly simplifies the interaction with this set of simulation modules. This study shows that considerable benefits can be obtained by application of computer science concepts to the engineering domain, by helping domain experts to tailor interactive tools to suit their needs.
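
To illustrate the subclassing strategy in a few lines, the sketch below shows a base simulation class with problem-domain and presentation extension points, and one toy domain subclass. Class names are invented; the actual prototype targets the chemical recovery cycle of a paper pulp mill.

```python
# Object-oriented simulation sketch: extend by subclassing at the domain
# level (step) or the presentation level (render).
class SimulationModule:
    def step(self, dt: float) -> None:
        raise NotImplementedError       # problem-domain extension point

    def render(self) -> str:
        return type(self).__name__      # presentation extension point


class Tank(SimulationModule):
    """Toy domain subclass: a liquid tank with constant in/outflow."""
    def __init__(self, level=0.0, inflow=1.0, outflow=0.5):
        self.level, self.inflow, self.outflow = level, inflow, outflow

    def step(self, dt):
        self.level = max(0.0, self.level + (self.inflow - self.outflow) * dt)

    def render(self):
        return f"Tank level: {self.level:.2f} m3"


modules = [Tank()]              # configuration could be defined on-line
for _ in range(10):             # in the prototype, modules run concurrently
    for m in modules:
        m.step(dt=0.1)
print(modules[0].render())      # Tank level: 0.50 m3
```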

Relevance:

30.00%

Publisher:

Abstract:

University students encounter difficulties with academic English because of its vocabulary, phraseology, and variability, and also because academic English differs in many respects from general English, the language which they have experienced before starting their university studies. Although students have been provided with many dictionaries that contain some helpful information on words used in academic English, these dictionaries remain focused on the uses of words in general English. There is therefore a gap in the dictionary market for a dictionary for university students, and this thesis provides a proposal for such a dictionary (called the Dictionary of Academic English; DOAE) in the form of a model which depicts how the dictionary should be designed, compiled, and offered to students. The model draws on state-of-the-art techniques in lexicography, dictionary-use research, and corpus linguistics. The model demanded the creation of a completely new corpus of academic language (Corpus of Academic Journal Articles; CAJA). The main advantages of the corpus are its large size (83.5 million words) and balance. Having access to a large corpus of academic language was essential for a corpus-driven approach to data analysis. A good corpus balance in terms of domains enabled a detailed domain-labelling of senses, patterns, collocates, etc. in the dictionary database, which was then used to tailor the output according to the needs of different types of student. The model proposes a dictionary that is conceived as an online dictionary from the outset. The proposed dictionary is revolutionary in the way it addresses the needs of different types of student. It presents students with a dynamic dictionary whose contents can be customised according to the user's native language, subject of study, variant spelling preferences, and/or visual preferences (e.g. black and white).
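
As a toy illustration of the customisation mechanism, the sketch below re-ranks the senses of one entry by the user's subject of study. The data model is invented; the DOAE database itself is far richer, with domain labels on patterns and collocates as well.

```python
# Sketch: tailor dictionary output to the user's profile by ranking senses
# whose domain label matches the subject of study first.
ENTRY = {
    "headword": "analysis",
    "senses": [
        {"domain": "statistics", "definition": "systematic examination of data"},
        {"domain": "chemistry", "definition": "determination of composition"},
        {"domain": "general", "definition": "detailed examination of elements"},
    ],
}


def customise(entry: dict, subject: str) -> list[dict]:
    """Senses from the user's own subject first; stable order otherwise."""
    return sorted(entry["senses"],
                  key=lambda sense: sense["domain"] != subject)


for sense in customise(ENTRY, subject="statistics"):
    print(f'{sense["domain"]:>10}: {sense["definition"]}')
```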

Relevance:

30.00%

Publisher:

Abstract:

Using the resistance literature as an underpinning theoretical framework, this chapter analyzes how Web designers, through their daily practices, (i) adopt recursive, adaptive, and resisting behavior regarding the inclusion of social cues online and (ii) shape the socio-technical power relationship between designers and other stakeholders. Five vignettes in the form of case studies with expert individual Web designers are used. Findings point to three types of emerging resistance, namely market-driven resistance, ideological resistance, and functional resistance. In addition, a series of propositions is provided linking the various themes. Furthermore, the authors suggest that stratification in Web designer types is occurring and that resistance offers a novel lens through which to analyze the debate.

Relevance:

30.00%

Publisher:

Abstract:

Constructing and executing distributed systems that can adapt to their operating context in order to sustain provided services and service qualities are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent, and sometimes tangled with other services. Furthermore, the exponential growth of the number of potential system configurations, derived from the variabilities of each service, needs to be handled. Current practices of writing low-level reconfiguration scripts as part of the system code to handle runtime adaptation are both error prone and time consuming, and make adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of adaptive system construction and execution, and to handle the exponential growth of the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants, and limit the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
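
The closing idea, generating adaptation logic by comparing the current configuration model with the target composed model, can be sketched in a few lines. The action vocabulary and component names below are invented for illustration.

```python
# Models-at-runtime sketch: derive a reconfiguration script from the
# difference between the current and target configuration models.
def derive_adaptation(current: set[str], target: set[str]) -> list[tuple[str, str]]:
    """Return (action, component) pairs instead of hand-written low-level
    reconfiguration scripts."""
    script = [("stop", c) for c in sorted(current - target)]
    script += [("start", c) for c in sorted(target - current)]
    return script


current_model = {"logger", "gps-tracker", "low-res-camera"}
target_model = {"logger", "gps-tracker", "high-res-camera"}
print(derive_adaptation(current_model, target_model))
# [('stop', 'low-res-camera'), ('start', 'high-res-camera')]
```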

Relevance:

30.00%

Publisher:

Abstract:

Engineering adaptive software is an increasingly complex task. Here, we demonstrate Genie, a tool that supports the modelling, generation, and operation of highly reconfigurable, component-based systems. We showcase how Genie is used in two case studies: (i) the development and operation of an adaptive flood warning system, and (ii) a service discovery application. In this context, adaptation is enabled by the Gridkit reflective middleware platform.

Relevance:

30.00%

Publisher:

Abstract:

The semantic web vision is one in which rich, ontology-based semantic markup will become widely available. The availability of semantic markup on the web opens the way to novel, sophisticated forms of question answering. AquaLog is a portable question-answering system which takes queries expressed in natural language and an ontology as input, and returns answers drawn from one or more knowledge bases (KBs). We say that AquaLog is portable because the configuration time required to customize the system for a particular ontology is negligible. AquaLog presents an elegant solution in which different strategies are combined together in a novel way. It makes use of the GATE NLP platform, string metric algorithms, WordNet, and a novel ontology-based relation similarity service to make sense of user queries with respect to the target KB. Moreover, it includes a learning component, which ensures that the performance of the system improves over time, in response to the particular community jargon used by end users.
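
Of the ingredients listed, the string-metric step is the easiest to show in isolation. The sketch below maps a query term onto the closest relation name in a made-up ontology using Python's standard-library SequenceMatcher; AquaLog itself combines this with GATE, WordNet, and its relation similarity service.

```python
# Sketch: string-metric matching of a query term to ontology relation names.
from difflib import SequenceMatcher

ONTOLOGY_RELATIONS = ["worksFor", "hasAuthor", "locatedIn", "memberOf"]


def best_relation(query_term: str, threshold: float = 0.5):
    """Most similar relation name, or None if nothing clears the threshold
    (a system like AquaLog would then fall back on its other services)."""
    scored = [(SequenceMatcher(None, query_term.lower(), rel.lower()).ratio(), rel)
              for rel in ONTOLOGY_RELATIONS]
    score, rel = max(scored)
    return rel if score >= threshold else None


print(best_relation("works for"))  # -> worksFor
```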