15 results for Portlet-based application
in Aston University Research Archive
Abstract:
The introduction of agent technology raises several security issues that are beyond the capability of conventional security mechanisms, but research in protecting an agent from malicious host attacks is evolving. This research proposes two approaches to protecting an agent from being attacked by a malicious host. The first approach consists of an obfuscation algorithm that is able to protect the confidentiality of an agent and make it more difficult for a malicious host to spy on the agent. The algorithm uses multiple polynomial functions with multiple random inputs to convert an agent's critical data into a value that is meaningless to the malicious host. The effectiveness of the obfuscation algorithm is enhanced by the addition of noise code. The second approach consists of a mechanism that is able to protect the integrity of the agent, using state information recorded during the agent's execution in a remote host environment to detect a manipulation attack by a malicious host. Both approaches are implemented using a master-slave agent architecture that operates on a distributed migration pattern. Two sets of experimental tests were conducted. The first set measures the migration and migration-plus-computation overheads of the itinerary and distributed migration patterns; the second measures the security overhead of the proposed approaches. The protection of the agent is assessed by analysing its effectiveness under known attacks. Finally, an agent-based application, known as the Secure Flight Finder Agent-based System (SecureFAS), was developed to demonstrate the operation of the proposed approaches.
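The abstract does not spell out the algorithm; the following is a minimal sketch of the stated idea only (the two-polynomial additive mask and all names here are assumptions, not the thesis's actual scheme): the master agent keeps the coefficients and random inputs secret, so the value the slave agent carries to a remote host is meaningless on its own.

```python
import random

# Illustrative sketch: hide an agent's critical integer value behind a mask
# built from multiple randomly parameterised polynomials evaluated at
# multiple random inputs. Only the master agent can remove the mask.

def make_obfuscator(degree=3, n_polys=2):
    # Secret state held by the master agent, never shipped with the slave.
    polys = [[random.randint(1, 1000) for _ in range(degree + 1)]
             for _ in range(n_polys)]
    inputs = [random.randint(1, 1000) for _ in range(n_polys)]

    def evaluate(coeffs, x):
        return sum(c * x ** i for i, c in enumerate(coeffs))

    mask = sum(evaluate(p, r) for p, r in zip(polys, inputs))

    def obfuscate(value):
        return value + mask      # value carried by the slave agent

    def recover(obfuscated):
        return obfuscated - mask  # only the master knows the mask

    return obfuscate, recover

obfuscate, recover = make_obfuscator()
hidden = obfuscate(42)            # meaningless to a malicious host
assert recover(hidden) == 42
```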
Abstract:
In the UK, Open Learning has been used in industrial training for at least the last decade. Trainers and Open Learning practitioners have been concerned about the quality of the products and services being delivered. The argument put forward in this thesis is that there is ambiguity amongst industrialists over the meanings of 'Open Learning' and 'Quality in Open Learning'. For clarity, a new definition of Open Learning is proposed which challenges the traditional learner-centred approach favoured by educationalists. It introduces the concept that there are benefits afforded to the trainer/employer/teacher as well as to the learner. This enables a focussed view of what quality in Open Learning really means. Having discussed these issues, a new quantitative method of evaluating Open Learning is proposed, based upon an assessment of the degree of compliance with which products meet Parts 1 & 2 of the Open Learning Code of Practice. The vehicle for these research studies has been a commercial contract commissioned by the Training Agency for the Engineering Industry Training Board (EITB) to examine the quality of Open Learning products supplied to the engineering industry. A major part of this research has been the application of the evaluation technique to a range of 67 Open Learning products (in eight subject areas). The findings were that good quality products can be found right across the price range - and so can average and poor quality ones. The study also shows quite convincingly that there are good quality products to be found at less than £50. Finally, the majority (24 out of 34) of the good quality products were text-based.
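The thesis gives the method in full; purely as a hypothetical sketch of what a degree-of-compliance score could look like (the criteria names and the simple unweighted percentage are assumptions, not the thesis's actual scoring), a product might be rated as the share of Code of Practice criteria it satisfies:

```python
# Hypothetical sketch: score a product as the percentage of Open Learning
# Code of Practice criteria (Parts 1 & 2) that it satisfies.

def compliance_score(results):
    """results maps each criterion name to True/False (met / not met)."""
    met = sum(1 for satisfied in results.values() if satisfied)
    return 100.0 * met / len(results)

product = {"clear objectives": True, "self-assessment": True,
           "tutor support info": False, "study guidance": True}
print(f"compliance: {compliance_score(product):.0f}%")  # 75%
```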
Abstract:
This study systematically investigated the waveguide dispersion characteristics of long-period fibre gratings (LPFGs). It revealed that the coupled cladding modes resonating in the dispersion-turning-point region are intrinsically sensitive to external perturbations. Thus, LPFG-based devices requiring good stability should avoid this region. On the other hand, this ultra-sensitive zone can be exploited to realise highly efficient sensors and tuneable filters.
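As background (standard LPFG theory, not stated in the abstract), coupling from the core mode to the m-th cladding mode occurs at the phase-matching wavelength:

```latex
% Standard LPFG phase-matching condition (background, not from the abstract):
\[
  \lambda_{\mathrm{res}}^{(m)} = \left( n_{\mathrm{eff}}^{\mathrm{core}} - n_{\mathrm{eff}}^{\mathrm{clad},m} \right) \Lambda
\]
% where \Lambda is the grating period. At a dispersion turning point the
% slope d\lambda_{\mathrm{res}}/d\Lambda diverges, so small perturbations of
% index or period produce large resonance shifts, which is the origin of the
% ultra-sensitivity noted above.
```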
Abstract:
When applying multivariate analysis techniques in information systems and social science disciplines, such as management information systems (MIS) and marketing, the assumption that the empirical data originate from a single homogeneous population is often unrealistic. When applying a causal modeling approach, such as partial least squares (PLS) path modeling, segmentation is a key issue in coping with heterogeneity in the estimated cause-and-effect relationships. This chapter presents a new PLS path modeling approach which classifies units on the basis of the heterogeneity of the estimates in the inner model. If unobserved heterogeneity significantly affects the estimated path model relationships on the aggregate data level, the methodology allows homogeneous groups of observations to be created that exhibit distinctive path model estimates. The approach thus provides differentiated analytical outcomes that permit more precise interpretations of each segment formed. An application to a large data set from the American Customer Satisfaction Index (ACSI) substantiates the methodology's effectiveness in evaluating PLS path modeling results.
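The chapter's actual algorithm is not reproduced here; the following is a deliberately simplified illustration of the underlying idea (the synthetic data, the single structural path, and the residual-based reassignment are all assumptions): units are iteratively regrouped so that each segment receives its own inner-model path estimate.

```python
import numpy as np

# Simplified illustration (not the chapter's algorithm): uncover unobserved
# heterogeneity in a single inner-model path y = b*x by alternating between
# per-segment estimation and reassignment of units to the segment whose
# local model explains them best.

rng = np.random.default_rng(0)
x = rng.normal(size=200)
b_true = np.where(np.arange(200) < 100, 0.9, -0.4)   # two hidden segments
y = b_true * x + 0.1 * rng.normal(size=200)

K = 2
labels = rng.integers(0, K, size=200)                # random start
for _ in range(20):
    # OLS slope (through the origin) within each current segment
    b = np.array([(x[labels == k] @ y[labels == k]) /
                  (x[labels == k] @ x[labels == k]) for k in range(K)])
    resid = np.abs(y[:, None] - x[:, None] * b[None, :])
    labels = resid.argmin(axis=1)                    # reassign units

print("segment path estimates:", np.round(b, 2))  # ~0.9 and ~-0.4 (order may vary)
```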
Abstract:
This chapter discusses the current state of biomass-based combined heat and power (CHP) production in the UK. It presents an overview of the UK's energy policy and targets relevant to the deployment of biomass-based CHP and summarises the current status of renewables, biomass and CHP. A number of small-scale biomass-based CHP projects are described, along with indicative capital costs for combustion, pyrolysis and gasification technologies. For comparison purposes, it presents an overview of the respective situation in Europe, particularly in Sweden, Finland and Denmark, with a brief comment on novel CHP technologies in Austria. Finally, it draws some conclusions on the potential of small-scale biomass CHP in the UK. © 2011 Woodhead Publishing Limited. All rights reserved.
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data-driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data-driven application systems in order to further empower the role of data in this field.
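To make the "data-driven component-based application" idea concrete, here is an illustrative sketch only (the element names and component registry are assumptions, not Fluid's actual schema): a self-describing XML document declares which components to instantiate and how to wire them, and the runtime assembles the application from that data rather than from hard-coded logic.

```python
import xml.etree.ElementTree as ET

# Hypothetical application description: content data drives assembly.
SCENE = """
<application>
  <component id="clock"    type="Timer"    rate="60"/>
  <component id="renderer" type="Renderer" width="800" height="600"/>
  <connect from="clock" to="renderer"/>
</application>
"""

# Maps declared component types to factories (assumed names).
REGISTRY = {
    "Timer":    lambda a: {"kind": "Timer", "rate": int(a["rate"])},
    "Renderer": lambda a: {"kind": "Renderer",
                           "size": (int(a["width"]), int(a["height"]))},
}

root = ET.fromstring(SCENE)
components = {c.get("id"): REGISTRY[c.get("type")](c.attrib)
              for c in root.findall("component")}
links = [(c.get("from"), c.get("to")) for c in root.findall("connect")]
print(components, links)
```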
Abstract:
An investigation was carried out into the different approaches used by Expert Systems researchers to solve problems in the domain of Mechanical Design. The techniques used for conventional formal logic programming were compared with those used when applying Expert Systems concepts. A literature survey of design processes was also conducted with a view to adopting a suitable model of the design process. A model, comprising a variation on two established ones, was developed and applied to a problem within what are described as class 3 design tasks. The research explored the application of these concepts to Mechanical Engineering design problems and their implementation on a microcomputer using an Expert System building tool. It was necessary to explore the use of Expert Systems in this manner so as to bridge the gap between their use as a control structure and their use for detailed analytical design. The former application is well researched; this thesis discusses the latter. Some Expert System building tools available to the author at the beginning of his work were evaluated specifically for their suitability for Mechanical Engineering design problems. Microsynics was found to be the most suitable on which to implement a design problem, because of its simple but powerful semantic net knowledge representation structure and its ability to use other types of representation scheme. Two major implementations were carried out: the first a design program for a helical compression spring, and the second a gear-pair system design. Two concepts were proposed in the thesis for the modelling and implementation of design systems involving many equations; the method proposed enables equation manipulation and analysis using a combination of frames, semantic nets and production rules (see the sketch below). The use of semantic nets for purposes other than psychology and natural language interpretation is quite new and represents one of the author's major contributions to knowledge. The development of a purpose-built shell program for this type of design problem was recommended as an extension of the research; Microsynics may usefully be used as a platform for this development.
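The thesis's Microsynics implementation is not available here; as a minimal sketch of the frames-plus-production-rules idea under assumed notation, each equation can be held as a frame and fired by forward chaining once all but one of its quantities are known, shown for a helical compression spring:

```python
# Illustrative sketch (not the thesis's Microsynics implementation): design
# equations as frames, fired production-rule style when their inputs are
# known. Each rule is (output slot, input slots, function).

RULES = [
    ("C", ["D", "d"], lambda v: v["D"] / v["d"]),            # spring index
    ("k", ["G", "d", "D", "n"],                              # stiffness
     lambda v: v["G"] * v["d"]**4 / (8 * v["D"]**3 * v["n"])),
]

known = {"D": 0.030, "d": 0.003, "G": 79e9, "n": 10}  # m, m, Pa, active coils

changed = True
while changed:                       # forward-chain until nothing new fires
    changed = False
    for out, inputs, fn in RULES:
        if out not in known and all(i in known for i in inputs):
            known[out] = fn(known)
            changed = True

print(f"spring index C = {known['C']:.1f}, stiffness k = {known['k']:.0f} N/m")
```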
Abstract:
Mobile technology has not yet achieved widespread acceptance in the Architectural, Engineering, and Construction (AEC) industry. This paper presents work that is part of an ongoing research project focusing on the development of multimodal mobile applications for use in the AEC industry. It focuses specifically on a context-relevant, lab-based evaluation of two input modalities – stylus and soft-keyboard versus speech-based input – for use with a mobile data collection application for concrete test technicians. Both the manner in which the evaluation was conducted and the results obtained are discussed in detail.
Abstract:
A hybrid passive-active damping solution with an improved system stability margin and enhanced dynamic performance is proposed for high-power grid-interactive converters. In grid-connected active rectifier/inverter applications, the line-side LCL filter improves high-frequency attenuation and makes the converter compatible with stringent grid power quality regulations. Passive damping offers a simple and reliable solution, but it reduces overall converter efficiency. Active damping solutions do not increase system losses, but can guarantee stable operation only up to a certain speed of dynamic response, which is limited by the maximum bandwidth of the current controller. This paper examines this limit and introduces the concept of a hybrid passive-active damping solution with improved stability margin and high dynamic performance for line-side LCL filter based active rectifier/inverter applications. A detailed design and analysis of the hybrid approach, and the trade-off between system losses and dynamic performance in grid-connected applications, are reported. Simulation and experimental results from a 10 kVA prototype demonstrate the effectiveness of the proposed solution. An analytical study of system stability and dynamic response under variations of the controller and passive filter parameters is presented.
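As background on why damping is needed at all (the component values below are assumptions, not the paper's 10 kVA design), an undamped LCL filter has an undamped resonance at f_res, and the classic passive fix of a resistor in series with the filter capacitor dissipates real power, which is the efficiency penalty the hybrid approach targets:

```python
import math

# Sketch: LCL resonance frequency and the passive-damping loss it trades off.
L1 = 2.0e-3   # converter-side inductance (H, assumed)
L2 = 1.0e-3   # grid-side inductance (H, assumed)
Cf = 10.0e-6  # filter capacitance (F, assumed)

# Undamped resonance: w_res = sqrt((L1 + L2) / (L1 * L2 * Cf))
f_res = math.sqrt((L1 + L2) / (L1 * L2 * Cf)) / (2 * math.pi)
print(f"LCL resonance: {f_res:.0f} Hz")

# A damping resistor Rd in series with Cf dissipates roughly Rd * I_c^2,
# where I_c is the capacitor-branch RMS current (both values assumed).
Rd, I_c = 2.0, 3.0
print(f"passive damping loss ~ {Rd * I_c**2:.0f} W")
```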
Abstract:
Purpose: This paper extends the use of Radio Frequency Identification (RFID) data to the accounting of warehouse costs and services. The Time-Driven Activity-Based Costing (TDABC) methodology is enhanced with RFID data, collected in real time, about the duration of warehouse activities. This allows warehouse managers to obtain accurate and instant calculations of costs. RFID-enhanced TDABC (RFID-TDABC) is proposed as a novel application of RFID technology. Research Approach: RFID-TDABC is implemented on the warehouse processes of a case study company. The implementation covers receiving, put-away, order picking, and despatching. Findings and Originality: RFID technology is commonly used for identifying and tracking items. The use of RFID-generated information with TDABC can be successfully extended to the area of costing. This RFID-TDABC costing model will benefit warehouse managers with accurate and instant calculations of costs. Research Impact: There are still unexplored benefits to RFID technology in its applications in warehousing and the wider supply chain. A multi-disciplinary research approach led to combining RFID technology and the TDABC accounting method in order to propose RFID-TDABC. Combining methods and theories from different fields with RFID may lead researchers to develop new techniques such as the RFID-TDABC presented in this paper. Practical Impact: The RFID-TDABC concept will be of value to practitioners by showing how warehouse costs can be accurately measured using this approach. A better understanding of incurred costs may result in further optimisation of warehousing operations, lowering the costs of activities, and thus providing competitive pricing to customers. RFID-TDABC can be applied in the wider supply chain.
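As a minimal sketch of the costing arithmetic (the capacity cost rate, timestamps and currency are assumptions, not the case study's figures), TDABC costs each activity as capacity cost rate times time consumed, and RFID tag reads supply the actual durations instead of estimated ones:

```python
from datetime import datetime

# Sketch: RFID-TDABC as rate * measured duration per warehouse activity.
capacity_cost_rate = 0.80   # £ per minute of warehouse capacity (assumed)

# (activity, RFID read at start, RFID read at end) -- all timestamps assumed
reads = [
    ("receiving",     "2024-05-01T08:00:00", "2024-05-01T08:12:00"),
    ("put-away",      "2024-05-01T08:12:00", "2024-05-01T08:20:00"),
    ("order picking", "2024-05-01T10:05:00", "2024-05-01T10:31:00"),
    ("despatching",   "2024-05-01T11:00:00", "2024-05-01T11:09:00"),
]

for activity, start, end in reads:
    minutes = (datetime.fromisoformat(end) -
               datetime.fromisoformat(start)).total_seconds() / 60
    cost = capacity_cost_rate * minutes
    print(f"{activity:14s} {minutes:5.1f} min  £{cost:.2f}")
```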
Abstract:
Despite their generally increasing use, the adoption of mobile shopping applications often differs across purchase contexts. In order to advance our understanding of smartphone-based mobile shopping acceptance, this study integrates and extends existing approaches from the technology acceptance literature by examining two previously underexplored aspects. Firstly, the study examines the impact of different mobile and personal benefits (instant connectivity, contextual value and hedonic motivation), customer characteristics (habit) and risk facets (financial, performance, and security risk) as antecedents of mobile shopping acceptance. Secondly, it is assumed that several acceptance drivers differ in relevance depending on the perception of three mobile shopping characteristics (location sensitivity, time criticality, and extent of control), while other drivers are assumed to matter independently of context. Based on a dataset of 410 smartphone shoppers, the empirical results demonstrate that several acceptance predictors are associated with ease of use and usefulness, which in turn affect intentional and behavioral outcomes. Furthermore, the extent to which risks and benefits affect ease of use and usefulness is influenced by the three contextual characteristics. From a managerial perspective, the results show which factors to consider in the development of mobile shopping applications and in which application contexts they matter.