964 results for Computations Driven Systems
Abstract:
A self-adaptive system adjusts its configuration to tolerate changes in its operating environment. To date, requirements modelling methodologies for self-adaptive systems have necessitated analysis of all potential system configurations and the circumstances under which each is to be adopted. We argue that, by explicitly capturing and modelling uncertainty in the operating environment, and by verifying and analysing this model at runtime, it is possible for a system to adapt to tolerate some conditions that were not fully considered at design time. In this paper we showcase our tools and research results. © 2012 IEEE.
Abstract:
Small indigenous manufacturers of electronic equipment are coming under increasingly severe pressure to adopt a strong defensive position against large multinational and Far Eastern companies. A common response to this threat has been for these firms to adopt a 'market driven' business strategy based on quality and customer service, rather than a 'technology led' strategy which uses technical specification and price to compete. To successfully implement this type of strategy there is a need for production systems to be redesigned to suit the new demands of marketing. Increased range and fast response require economy of scope rather than economy of scale, while the organisation's culture must promote quality and process consciousness. This paper describes the 'Modular Assembly Cascade' concept which addresses these needs by applying the principles of flexible manufacturing (FMS) and just-in-time (JIT) to electronics assembly. A methodology for executing the concept is also outlined. This is called DRAMA (Design Routine for Adopting Modular Assembly).
Abstract:
Information technology companies are having to broaden their overall strategic view in deference to the premise that it is better to be market-driven than technology-led. Cost and technical performance are no longer the only considerations, as quality and service now demand equal recognition. The production of a high volume single item has given way to that of low volume multiple items, which in turn requires some modification of production systems and brings flexible manufacturing, Just-in-Time production and total quality control into sharper focus for the achievement of corporate objectives.
Abstract:
We argue that, for certain constrained domains, elaborate model transformation technologies, implemented from scratch in general-purpose programming languages, are unnecessary for model-driven engineering; instead, lightweight configuration of commercial off-the-shelf productivity tools suffices. In particular, in the CancerGrid project, we have been developing model-driven techniques for the generation of software tools to support clinical trials. A domain metamodel captures the community's best practice in trial design. A scientist authors a trial protocol, modelling their trial by instantiating the metamodel; customized software artifacts to support trial execution are generated automatically from the scientist's model. The metamodel is expressed as an XML Schema, in such a way that it can be instantiated by completing a form to generate a conformant XML document. The same process works at a second level for trial execution: among the artifacts generated from the protocol are models of the data to be collected, and the clinician conducting the trial instantiates such models in reporting observations, again by completing a form to create a conformant XML document, representing the data gathered during that observation. Simple standard form management tools are all that is needed. Our approach is applicable to a wide variety of information-modelling domains: not just clinical trials, but also electronic public sector computing, customer relationship management, document workflow, and so on. © 2012 Springer-Verlag.
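A minimal sketch of the form-to-document idea described in this abstract, assuming a hypothetical trial protocol and element names (trial, case_report_form, item) that are illustrative only, not the CancerGrid metamodel itself; the real approach expresses the metamodel as an XML Schema and validates the generated documents against it.

import xml.etree.ElementTree as ET

# Hypothetical "model" of a trial protocol, as a scientist might enter it via a form.
protocol = {
    "title": "Example Phase II Trial",
    "items": [
        {"name": "tumour_size_mm", "type": "integer"},
        {"name": "treatment_arm", "type": "string"},
    ],
}

# Generate an XML document from the model; in the approach described above,
# such documents would additionally be validated against the XML Schema metamodel.
trial = ET.Element("trial", title=protocol["title"])
crf = ET.SubElement(trial, "case_report_form")
for item in protocol["items"]:
    ET.SubElement(crf, "item", name=item["name"], type=item["type"])

print(ET.tostring(trial, encoding="unicode"))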
Abstract:
Evaporative pads are frequently used for the cooling of greenhouses. However, a drawback of this method is the consumption of freshwater. In this paper it is shown, both theoretically and through a practical example, that effective evaporative cooling can be achieved using seawater in place of fresh water. The advantages and drawbacks of using seawater are discussed more generally. In climates that are both hot and humid, evaporative systems cannot always provide sufficient cooling, with the result that cultivation often has to be halted during the hottest months of the year. To overcome this, we propose a concept in which a desiccant pad is used to dehumidify the air before it enters the evaporative pad. The desiccant pad is supplied with a hygroscopic liquid that is regenerated by the energy of the sun. The performance of this concept has been modelled and the properties of various liquids have been compared. An attractive option is to obtain the liquid from seawater itself, given that seawater contains hygroscopic salts such as magnesium chloride. Preliminary experiments are reported in which magnesium chloride solution has been regenerated beneath a solar simulator.
Abstract:
Purpose – This paper describes a “work in progress” research project being carried out with a public health care provider in the UK, a large NHS hospital Trust. Enhanced engagement with patients is one of the Trust’s core principles, but it is recognised that much more needs to be done to achieve this, and that ICT systems may be able to provide some support. The project is intended to find ways to better capture and evaluate the “voice of the patient” in order to lead to improvements in health care quality, safety and effectiveness. Design/methodology/approach – We propose to investigate the use of a patient-orientated knowledge management system (KMS) in managing knowledge about and from patients. The study is a mixed methods (quantitative and qualitative) investigation based on traditional action research, intended to answer the following three research questions: (1) How can a KMS be used as a mechanism to capture and evaluate patient experiences to provoke patient service change? (2) How can the KMS assist in providing a mechanism for systematising patient engagement? (3) How can patient feedback be used to stimulate improvements in care, quality and safety? Originality/value – This methodology aims to involve patients at all phases of the study from its initial design onwards, thus leading to an understanding of the issues associated with using a KMS to manage knowledge about and for patients that is driven by the patients themselves. Practical implications – The outcomes of the project for the collaborating hospital will be, firstly, a system for capturing and evaluating knowledge about and from patients, and then, as a consequence, improved outcomes for both the patients and the service provider. More generally, it will produce a set of guidelines for managing patient knowledge in an NHS hospital that have been tested in one case example.
Abstract:
When faced with the task of designing and implementing a new self-aware and self-expressive computing system, researchers and practitioners need a set of guidelines on how to use the concepts and foundations developed in the Engineering Proprioception in Computing Systems (EPiCS) project. This report provides such guidelines on how to design self-aware and self-expressive computing systems in a principled way. We have documented different categories of self-awareness and self-expression levels using architectural patterns. We have also documented common architectural primitives, their possible candidate techniques and attributes for architecting self-aware and self-expressive systems. Drawing on the knowledge obtained from the previous investigations, we proposed a pattern-driven methodology for engineering self-aware and self-expressive systems to assist in utilising the patterns and primitives during design. The methodology contains detailed guidance to make decisions with respect to the possible design alternatives, providing a systematic way to build self-aware and self-expressive systems. Then, we qualitatively and quantitatively evaluated the methodology using two case studies. The results reveal that our pattern-driven methodology covers the main aspects of engineering self-aware and self-expressive systems, and that the resulting systems perform significantly better than the non-self-aware systems.
Abstract:
Different types of ontologies and the knowledge or metaknowledge connected to them are considered and analyzed, aiming at realization in contemporary information security systems (ISS), especially in intrusion detection systems (IDS) or intrusion prevention systems (IPS). The human-centered methods INCONSISTENCY, FUNNEL, CALEIDOSCOPE and CROSSWORD are algorithmic or data-driven methods based on ontologies. All of them interact on the competitive principle of 'survival of the fittest', and they are controlled by a Synthetic MetaMethod SMM. It is shown that data analysis frequently needs an act of creation, especially if it is applied to knowledge-poor environments. It is also shown that human-centered methods are very suitable for resolutions in such cases, and that they are often based on the usage of dynamic ontologies.
Abstract:
The concern over the quality of delivering video streaming services in mobile wireless networks is addressed in this work. A framework that enhances the Quality of Experience (QoE) of end users through a quality-driven resource allocation scheme is proposed. An objective no-reference quality metric, Pause Intensity (PI), plays a key role: it is adopted to derive a resource allocation algorithm for video streaming. The framework is examined in the context of 3GPP Long Term Evolution (LTE) systems. The requirements and structure of the proposed PI-based framework are discussed, and results are compared with existing scheduling methods on fairness, efficiency and correlation (between the required and allocated data rates). Furthermore, it is shown that the proposed framework can produce a trade-off between the three parameters through the QoE-aware resource allocation process.
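A toy sketch of the quality-driven allocation idea, assuming a simplified pause-intensity-like proxy (fraction of time spent paused, weighted by pause count); the actual Pause Intensity definition and the LTE scheduler design are those of the cited work, not this illustration.

# Hypothetical per-user streaming state; rates in Mbit/s, times in seconds.
users = [
    {"id": 0, "pause_time": 2.0, "pauses": 4, "play_time": 60.0, "required_rate": 1.5},
    {"id": 1, "pause_time": 0.5, "pauses": 1, "play_time": 60.0, "required_rate": 2.0},
    {"id": 2, "pause_time": 5.0, "pauses": 8, "play_time": 60.0, "required_rate": 1.0},
]

def pause_intensity(u):
    # Illustrative proxy only: pause-time fraction weighted by pause frequency.
    return (u["pause_time"] / (u["play_time"] + u["pause_time"])) * u["pauses"]

def allocate(users, capacity):
    # Give each user a share of capacity proportional to how badly its playback
    # suffers, capped at the rate it actually requires (fairness vs efficiency).
    weights = {u["id"]: pause_intensity(u) for u in users}
    total = sum(weights.values()) or 1.0
    return {u["id"]: min(u["required_rate"], capacity * weights[u["id"]] / total)
            for u in users}

print(allocate(users, capacity=4.0))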
Abstract:
This paper presents the application of Networks of Evolutionary Processors to Decision Support Systems, specifically Knowledge-Driven DSS. Symbolic information and rule-based behavior in Networks of Evolutionary Processors turn out to be a powerful tool for obtaining decisions based on objects present in the network. The non-deterministic and massively parallel mode of operation allows NP problems to be solved in linear time. A working NEP example is shown.
Abstract:
The paper was presented at the 12th International Conference on Applications of Computer Algebra, Varna, Bulgaria, June 2006.
Abstract:
Bayesian algorithms pose a limit to the performance learning algorithms can achieve. Natural selection should guide the evolution of information processing systems towards those limits. What can we learn from this evolution, and what properties do the intermediate stages have? While this question is too general to permit any answer, progress can be made by restricting the class of information processing systems under study. We present analytical and numerical results for the evolution of on-line algorithms for learning from examples for neural network classifiers, which may or may not include a hidden layer. The analytical results are obtained by solving a variational problem to determine the learning algorithm that leads to maximum generalization ability. Simulations using evolutionary programming, for programs that implement learning algorithms, confirm and expand the results. The principal result is not just that the evolution is towards a Bayesian limit; indeed, that limit is essentially reached. In addition, we find that evolution is driven by the discovery of useful structures or combinations of variables and operators. In different runs the temporal order of the discovery of such combinations is unique. The main result is that combinations that signal the surprise brought by an example always arise before combinations that serve to gauge the performance of the learning algorithm. These latter structures can be used to implement annealing schedules. The temporal ordering can be understood analytically as well by doing the functional optimization in restricted functional spaces. We also show that there is data suggesting that the appearance of these traits follows the same temporal ordering in biological systems. © 2006 American Institute of Physics.
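A small sketch, assuming a toy on-line perceptron whose weight update is gated by the 'surprise' of each example (how strongly it disagrees with the current weights); the gating function here is an illustrative choice, not the optimal modulation derived variationally in the paper.

import numpy as np

def online_learn(examples, labels, modulation):
    # Hebbian-style on-line rule: each update is scaled by modulation(surprise),
    # where surprise is the negative margin of the example under current weights.
    w = np.zeros(examples.shape[1])
    for x, y in zip(examples, labels):
        surprise = -y * np.dot(w, x)       # large when the example disagrees with w
        w += modulation(surprise) * y * x
    return w

rng = np.random.default_rng(0)
d = 20
teacher = rng.standard_normal(d)            # toy "teacher" defining the true labels
X = rng.standard_normal((500, d))
y = np.sign(X @ teacher)

w_const = online_learn(X, y, modulation=lambda s: 0.05)               # constant rate
w_gated = online_learn(X, y, modulation=lambda s: 0.05 * (s >= 0.0))  # update on errors only

for name, w in [("constant", w_const), ("surprise-gated", w_gated)]:
    overlap = float(w @ teacher) / (np.linalg.norm(w) * np.linalg.norm(teacher))
    print(name, "alignment with teacher:", round(overlap, 3))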
Abstract:
This research is focused on the optimisation of resource utilisation in wireless mobile networks with consideration of the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated. These include video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems.

A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment have been used in the validation tests. It has been shown that Pause Intensity is closely correlated with subjective quality measurement in terms of the Mean Opinion Score, and that this correlation property is content independent.

Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment, where the proposed research framework is examined and the results are compared with existing scheduling methods on the achievable fairness, efficiency and correlation.

Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users by considering their perceived quality for the services received. Meanwhile, a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE.

Finally, the work on interworking between the mobile communication system at the macro-cell level and the different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism of the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum. The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and each of the WiFi access points involved. The performance of non-seamless and user-controlled mobile traffic offloading (through the mobile WiFi devices) has been evaluated and compared with that of standard operator-controlled WiFi hotspots.
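A brief sketch of the rate-adaptation role of Pause Intensity mentioned above, assuming a hypothetical set of available bitrates and a simple threshold controller; the actual PI-regulated adaptation function and its thresholds are those of the thesis, not this illustration.

# Hypothetical bitrate ladder (Mbit/s) offered by the adaptive streaming server.
AVAILABLE_RATES = [0.5, 1.0, 2.0, 4.0]

def next_bitrate(current, pi_measured, pi_target=0.1):
    # Step down when measured pause intensity exceeds the target QoE level;
    # step up cautiously when playback is running well below the target.
    idx = AVAILABLE_RATES.index(current)
    if pi_measured > pi_target and idx > 0:
        return AVAILABLE_RATES[idx - 1]
    if pi_measured < pi_target / 2 and idx < len(AVAILABLE_RATES) - 1:
        return AVAILABLE_RATES[idx + 1]
    return current

print(next_bitrate(2.0, pi_measured=0.25))  # degraded playback -> 1.0
print(next_bitrate(2.0, pi_measured=0.02))  # smooth playback  -> 4.0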
Abstract:
We discuss some main points of computer-assisted proofs based on reliable numerical computations. Such so-called self-validating numerical methods, in combination with exact symbolic manipulations, result in very powerful mathematical software tools. These tools allow proving mathematical statements (existence of a fixed point, of a solution of an ODE, of a zero of a continuous function, of a global minimum within a given range, etc.) using a digital computer. To validate the assertions of the underlying theorems, fast finite-precision arithmetic is used. The results are absolutely rigorous. To demonstrate the power of reliable symbolic-numeric computations, we investigate in some detail the verification of very long periodic orbits of chaotic dynamical systems. The verification is done directly in Maple, e.g. using the Maple Power Tool intpakX or, more efficiently, using the C++ class library C-XSC.
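A minimal, self-contained sketch of the self-validating idea, assuming exact rational arithmetic instead of the directed floating-point rounding that real tools such as intpakX and C-XSC provide; it rigorously encloses a zero of f(x) = x^2 - 2 (i.e. sqrt(2)) by verified sign changes.

from fractions import Fraction

def f(x):
    # f(x) = x^2 - 2; evaluated exactly over rationals, so no rounding error.
    return x * x - 2

# Bisection keeping the half-interval on which a sign change of the continuous
# function f is verified, giving a computer-assisted proof that a zero lies inside.
lo, hi = Fraction(1), Fraction(2)
assert f(lo) < 0 < f(hi)                  # verified sign change on the start interval
for _ in range(30):
    mid = (lo + hi) / 2
    if f(lo) * f(mid) <= 0:
        hi = mid
    else:
        lo = mid

print("sqrt(2) is enclosed in [", float(lo), ",", float(hi), "]")
print("enclosure width:", float(hi - lo))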
Abstract:
Most current 3D landscape visualisation systems either use bespoke hardware solutions, or offer a limited amount of interaction and detail when used in real-time mode. We are developing a modular, data-driven 3D visualisation system that can be readily customised to specific requirements. By utilising the latest software engineering methods and bringing a dynamic data-driven approach to geo-spatial data visualisation, we will deliver an unparalleled level of customisation in near-photorealistic, real-time 3D landscape visualisation. In this paper we show the system framework and describe how this employs data-driven techniques. In particular we discuss how data-driven approaches are applied to the spatiotemporal management aspect of the application framework, and describe the advantages these convey. © Springer-Verlag Berlin Heidelberg 2006.