833 results for CRITICAL SYSTEMS
Abstract:
With the aid of Gibbs thermodynamics, an expression for the spinodal was derived for a polydisperse polymer-solvent system in the framework of the Sanchez-Lacombe Lattice Fluid Theory (SLLFT). For convenience, we considered a model polydisperse polymer containing three sub-components. According to our calculation, the spinodal depends on both the weight-average ($\bar{M}_w$) and number-average ($\bar{M}_n$) molecular weights of the polydisperse polymer, whereas it shows no visible dependence on the z-average molecular weight ($\bar{M}_z$). The dependence of free volume on composition, temperature, molecular weight, and molecular weight distribution gives rise to the effect of $\bar{M}_n$ on the spinodal. Moreover, it has been found that changing $\bar{M}_w$ affects the spinodal much more strongly than changing $\bar{M}_n$, and that the extrema of the spinodal increase with the weight-average molecular weight of the polymer in solutions with an upper critical solution temperature (UCST). However, the effect of polydispersity on the spinodal can be neglected for a polymer with a sufficiently high weight-average molecular weight. A simpler expression for the spinodal of the polydisperse polymer solution in the framework of SLLFT was also derived under the assumptions $\upsilon^* = \upsilon_1^* = \upsilon_2^*$ and $(1/r_1^0) - (1/r_{2i}^0) \rightarrow (1/r_1^0)$.
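For reference, the three molecular-weight averages named above are the standard moments of the chain-length distribution; for a model polymer with three sub-components, with $n_i$ chains of molar mass $M_i$ ($i = 1, 2, 3$), they read

\[
\bar{M}_n = \frac{\sum_i n_i M_i}{\sum_i n_i}, \qquad
\bar{M}_w = \frac{\sum_i n_i M_i^2}{\sum_i n_i M_i}, \qquad
\bar{M}_z = \frac{\sum_i n_i M_i^3}{\sum_i n_i M_i^2}.
\]

These are textbook identities, not equations taken from the paper itself; they make the abstract's claim concrete: the spinodal picks up the first two moments of the distribution but not the third.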
Abstract:
The latest buzz phrase to enter the world of design research is “Design Thinking”. But is this anything new, and does it really have any practical or theoretical relevance to the design world? Many sceptics believe the term has more to do with business strategy and little to do with the complex process of designing products, services and systems. Moreover, many view the term as misleading and a cheap attempt to piggyback the world of business management onto design. This paper asks: is design thinking anything new? Several authors have explicitly or implicitly articulated the term “Design Thinking” before, such as Peter Rowe’s seminal book “Design Thinking” [1], first published in 1987, and Herbert Simon’s “The Sciences of the Artificial” [2], first published in 1969. In Tim Brown’s “Change by Design” [3], design thinking is conceived as a system of three overlapping spaces rather than a sequence of orderly steps: inspiration, the problem or opportunity that motivates the search for solutions; ideation, the process of generating, developing and testing ideas; and implementation, the path that leads from the design studio, lab and factory to the market. This paper examines and critically analyses the tenets of this new design thinking manifesto against three case studies of modern design practice. As such, the paper compares design thinking theory with the reality of design in practice.
Abstract:
J. Keppens and Q. Shen. Granularity and disaggregation in compositional modelling with applications to ecological systems. Applied Intelligence, 25(3):269-292, 2006.
Abstract:
Predictability - the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements - is a crucial, highly desirable property of responsive embedded systems. This paper presents an overview of a development methodology for responsive systems which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems - possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing - cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems, not to mention eliminating potential hazards that would otherwise have gone unnoticed. The TRA model is presented to system developers through the CLEOPATRA programming language. CLEOPATRA features a C-like imperative syntax for the description of computation, which makes it easier to incorporate in applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. CLEOPATRA is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem-proving techniques. Since 1989, an ancestor of CLEOPATRA has been in use as a specification and simulation language for embedded time-critical robotic processes.
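Since the abstract describes the TRA discipline only in prose, a minimal sketch may help. The following toy Python model (all names here are hypothetical, and this is NOT CLEOPATRA syntax, which the abstract describes but does not show) illustrates how restricting specifications to causal, event-triggered reactions with bounded response times rules out clairvoyant or capricious behaviour by construction:

```python
# Illustrative sketch only: a toy "time-constrained reactive automaton".
from dataclasses import dataclass, field

@dataclass
class Event:
    channel: str
    time: float  # time at which the event occurs

@dataclass
class TRA:
    # channel -> (min_delay, max_delay, action); reactions are the only
    # source of behaviour, so the automaton is reactive and causal by
    # construction: it cannot act before a triggering event (clairvoyance)
    # or without one (caprice).
    reactions: dict = field(default_factory=dict)
    now: float = 0.0

    def on(self, channel, min_delay, max_delay, action):
        self.reactions[channel] = (min_delay, max_delay, action)

    def signal(self, event):
        if event.time < self.now:
            raise ValueError("event from the past: causality violated")
        self.now = event.time
        lo, hi, action = self.reactions[event.channel]
        fire_time = self.now + lo  # fire at the earliest admissible time
        assert self.now + lo <= fire_time <= self.now + hi
        action(fire_time)

# Usage: a sensor event must be answered within 1 to 5 ms of its occurrence.
tra = TRA()
tra.on("sensor", 0.001, 0.005, lambda t: print(f"actuate at t = {t:.3f} s"))
tra.signal(Event("sensor", time=1.0))
```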
Abstract:
We survey several of the research efforts pursued by the iBench and snBench projects in the CS Department at Boston University over the last half dozen years. These activities use ideas and methodologies inspired by recent developments in other parts of computer science -- particularly in formal methods and in the foundations of programming languages -- but now specifically applied to the certification of safety-critical networking systems. This is research jointly led by Azer Bestavros and Assaf Kfoury with the participation of Adam Bradley, Andrei Lapets, and Michael Ocean.
Abstract:
The original solution to the high failure rate of software development projects was the imposition of an engineering approach to software development, with processes aimed at providing a repeatable structure to maintain consistency in the ‘production process’. Despite these attempts at addressing the crisis in software development, others have argued that the rigid processes of an engineering approach did not provide the solution. The Agile approach to software development strives to change how software is developed. It does this primarily by relying on empowered teams of developers who are trusted to manage the necessary tasks, and who accept that change is a necessary part of a development project. The use of, and interest in, Agile methods in software development projects has expanded greatly, yet this has been predominantly practitioner driven. There is a paucity of scientific research on Agile methods and how they are adopted and managed. This study aims to address that paucity by examining the adoption of Agile through a theoretical lens. The lens used in this research is that of double loop learning theory. The behaviours required in an Agile team are the same behaviours required in double loop learning; therefore, a transition to double loop learning is required for a successful Agile adoption. The theory of triple loop learning highlights that power factors (or power mechanisms in this research) can inhibit the attainment of double loop learning. This study identifies the negative behaviours - potential power mechanisms - that can inhibit the double loop learning inherent in an Agile adoption, determines how the Agile processes and behaviours can create these power mechanisms, and examines how these power mechanisms affect double loop learning and the Agile adoption. This is a critical realist study, which acknowledges that the real world is a complex one, hierarchically structured into layers. An a priori framework is created to represent these layers, which are categorised as: the Agile context, the power mechanisms, and double loop learning. The aim of the framework is to explain how the Agile processes and behaviours, through the teams of developers and project managers, can ultimately affect the double loop learning behaviours required in an Agile adoption. Four case studies provide further refinement to the framework, with changes required due to observations that were often different from what the existing literature would have predicted. The study concludes by explaining how the teams of developers, the individual developers, and the project managers, working with the Agile processes and required behaviours, can inhibit the double loop learning required in an Agile adoption. A solution is then proposed to mitigate these negative impacts. Additionally, two new research processes are introduced to add to the Information Systems research toolkit.
Towards a situation-awareness-driven design of operational business intelligence & analytics systems
Abstract:
Faced with both the flood and the time-sensitivity of data in the organizational context, decision makers are challenged to choose an appropriate decision alternative in a given situation. In particular, operational actors face the challenge of making business-critical decisions in a short time and at high frequency. The construct of Situation Awareness (SA) has been established in cognitive psychology as a valid basis for understanding the behavior and decision making of human beings in complex and dynamic systems. SA gives decision makers the ability to make informed, time-critical decisions and thereby improve the performance of the respective business process. This research paper leverages SA as the starting point for a design science project for Operational Business Intelligence and Analytics systems and suggests a first version of design principles.
Abstract:
Our research follows a design science approach to develop a method that supports the initialization of ES implementation projects – the chartering phase. This project phase is highly relevant for implementation success, but is understudied in IS research. In this paper, we derive design principles for a chartering method based on a systematic review of ES implementation literature and semi-structured expert interviews. Our analysis identifies differences in the importance of certain success factors depending on the system type. The proposed design principles are built on these factors and are linked to chartering key activities. We specifically consider system-type-specific chartering aspects for process-centric Business Intelligence & Analytics (BI&A) systems, which are an emerging class of systems at the intersection of BI&A and business process management. In summary, this paper proposes design principles for a chartering method – considering specifics of process-centric BI&A.
Abstract:
BACKGROUND: Computer simulations are of increasing importance in modeling biological phenomena. Their purpose is to predict behavior and guide future experiments. The aim of this project is to model the early immune response to vaccination with an agent-based immune response simulation that incorporates realistic biophysics and intracellular dynamics, is sufficiently flexible to accurately model the multi-scale nature and complexity of the immune system, and maintains the high performance critical to scientific computing. RESULTS: The Multiscale Systems Immunology (MSI) simulation framework is an object-oriented, modular simulation framework written in C++ and Python. The software implements a modular design that allows for flexible configuration of components and initialization of parameters, thus allowing simulations to be run that model processes occurring over different temporal and spatial scales. CONCLUSION: MSI addresses the need for a flexible and high-performing agent-based model of the immune system.
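To make the framework's design concrete, the following minimal Python sketch illustrates the generic agent-based update loop such simulators run; the names and structure are hypothetical illustrations, not the MSI framework's actual C++/Python API:

```python
# A minimal agent-based sketch of a multiscale update loop.
import random

class Agent:
    def __init__(self, kind, x, y):
        self.kind, self.x, self.y = kind, x, y

    def step(self, dt):
        # Coarse (tissue) scale: random-walk motility of the cell.
        self.x += random.gauss(0.0, dt)
        self.y += random.gauss(0.0, dt)
        # A finer intracellular model could be integrated here with a
        # smaller internal time step; running different components at
        # different temporal scales is what makes the model multiscale.

class World:
    def __init__(self, agents):
        self.agents = agents

    def run(self, steps, dt=0.1):
        for _ in range(steps):
            for agent in self.agents:
                agent.step(dt)

world = World([Agent("T-cell", 0.0, 0.0) for _ in range(10)])
world.run(steps=100)
```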
Abstract:
This is a detailed analysis of invariant measures for one-dimensional dynamical systems with random switching. In particular, we prove the smoothness of the invariant densities away from critical points and describe the asymptotics of the invariant densities at critical points.
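A common way to write such a system (a generic formulation consistent with the abstract, not necessarily the paper's notation): the state $x_t$ follows

\[
\dot{x}_t = f_{\sigma_t}(x_t),
\]

where $\sigma_t$ is a continuous-time Markov chain on a finite index set $\{1, \dots, k\}$ that switches randomly between the driving vector fields $f_1, \dots, f_k$; the critical points at which the behaviour of the invariant densities is delicate are the zeros of the $f_i$.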
Abstract:
Software-based control of life-critical embedded systems has become increasingly complex and, to a large extent, has come to determine human safety. For example, implantable cardiac pacemakers have over 80,000 lines of code which are responsible for maintaining the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) automatically generated into modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation and testing process for complex software-controlled embedded systems.
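As an illustration of the kind of timing property such models capture, here is a toy, hand-written Python sketch of one pacemaker timing requirement, the lower rate interval (LRI); it is an assumption-laden illustration, not output of UPP2SF or the paper's UPPAAL/Stateflow models, and the 1-second interval is a hypothetical value:

```python
# Illustrative only: a timed-automaton-style model of the lower rate
# interval, which guarantees a beat (sensed or paced) at a minimum rate.
LRI_MS = 1000  # hypothetical lower rate interval: pace if no beat within 1 s

class PacemakerLRI:
    def __init__(self):
        self.clock_ms = 0  # clock reset on every sensed or paced beat

    def tick(self, dt_ms, sensed_beat):
        self.clock_ms += dt_ms
        if sensed_beat:
            self.clock_ms = 0      # sensed intrinsic beat: reset the clock
            return "sense"
        if self.clock_ms >= LRI_MS:
            self.clock_ms = 0      # guard "clock >= LRI" fires a pace
            return "pace"
        return None

pm = PacemakerLRI()
events = [pm.tick(100, sensed_beat=(t == 3)) for t in range(25)]
print([e for e in events if e])  # ['sense', 'pace', 'pace']
```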
Abstract:
Relatives of Planetary Nebulae, such as barium stars or symbiotic systems, can shed light on the connection between Planetary Nebulae and binarity. Because of the observational selection effects against direct spectroscopic detection of binary PNe cores with orbital periods longer than a few dozen days, at present these "awkward relatives" are a critical source of our knowledge about the binary PNe population at longer periods. Below, a few examples are discussed, posing constraints on attempts to model nebula ejection processes in a binary.
Abstract:
While e-learning technologies are continuously developing, there are a number of emerging issues and challenges that have a significant impact on e-learning research and design. These span educational, technological, sociological, and psychological viewpoints. The extant literature points out that a large number of existing e-learning systems have problems offering reusable, personalized and learner-centric content. While developers place emphasis on the technology aspects of e-learning, critical conceptual and pedagogical issues are often ignored. This paper reports on our research in the design and development of personalised e-learning systems and some of the challenges and issues faced.
Abstract:
The diversity gains achievable in the generalised distributed antenna system with cooperative users (GDAS-CU) are considered. A GDAS-CU comprises M widely separated access points (APs) at one side of the link and N geographically close user terminals (UTs) at the other side. The UTs collaborate to enhance system performance, where idealised message sharing among the UTs is assumed. First, geometry-based network models are proposed to describe the topology of a GDAS-CU. The mean cross-correlation coefficients of signals received from non-collocated APs and UTs are calculated based on the network topology and correlation models derived from empirical data. The analysis also extends to more general scenarios where the APs are placed in clustered form due to the constraints of street layout or building structure. Subsequently, a generalised signal attenuation model derived from several stochastic ray-tracing-based pathloss models is applied to describe the power-decaying pattern in urban built-up areas, where the GDAS-CU may be deployed. Armed with the cross-correlation and pathloss model preliminaries, an intrinsic measure of the cooperative diversity obtainable from a GDAS-CU is then derived: the number of independent fading channels that can be averaged over to detect symbols. The proposed analytical framework provides critical insight into the degree of possible performance improvement when combining multiple copies of the received signal in such systems.
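One standard form that stochastic pathloss models of this kind reduce to (a textbook expression, offered here only to illustrate the power-decaying pattern the abstract refers to, not necessarily the paper's exact model) is the log-distance model with shadowing:

\[
PL(d) = PL(d_0) + 10\,n \log_{10}\!\left(\frac{d}{d_0}\right) + X_\sigma,
\]

where $d_0$ is a reference distance, $n$ is the environment-dependent pathloss exponent, and $X_\sigma$ is a zero-mean Gaussian shadowing term in dB.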
Abstract:
This, the second edition, adopts a critical and theoretical perspective on remuneration policy and practices in the UK, from the decline of collective bargaining to the rise of more individualistic systems based on employee performance. It tackles the conceptual issues missing from existing texts in the field of HRM by critically examining the latest academic literature on the topic. [Taken from publisher's product description].