14 results for hierarchical systems
at Indian Institute of Science - Bangalore - India
Abstract:
Conformance testing focuses on checking whether an implementation under test (IUT) behaves according to its specification. Typically, testers are interested in performing targeted tests that exercise certain features of the IUT. This intention is formalized as a test purpose. The tester needs a "strategy" to reach the goal specified by the test purpose. Also, for a particular test case, the strategy should tell the tester whether the IUT has passed, failed, or deviated from the test purpose. In [8], Jeron and Morel show how to compute, for a given finite state machine specification and a test purpose automaton, a complete test graph (CTG) which represents all test strategies. In this paper, we consider the case when the specification is a hierarchical state machine and show how to compute a hierarchical CTG which preserves the hierarchical structure of the specification. We also propose an algorithm for an online test oracle which avoids the space overhead associated with the CTG.
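To make the oracle idea concrete, here is a minimal sketch of an online test oracle that tracks a specification FSM and a test purpose automaton in lockstep and emits a pass/fail/inconclusive verdict. The toy machines, the action names, and the function itself are hypothetical illustrations; they are not the paper's hierarchical CTG construction.

```python
SPEC = {            # finite state machine: (state, action) -> next state
    ("s0", "req"): "s1",
    ("s1", "ack"): "s0",
    ("s1", "err"): "s2",
}
PURPOSE = {         # test purpose automaton over the same actions
    ("p0", "req"): "p1",
    ("p1", "err"): "ACCEPT",   # the targeted behaviour we want to exercise
}

def online_oracle(trace, spec_state="s0", tp_state="p0"):
    """Consume observed IUT actions one by one and return a verdict."""
    for action in trace:
        nxt = SPEC.get((spec_state, action))
        if nxt is None:
            return "fail"              # IUT left the specified behaviour
        spec_state = nxt
        tp_next = PURPOSE.get((tp_state, action))
        if tp_next is None:
            return "inconclusive"      # conformant, but off the test purpose
        if tp_next == "ACCEPT":
            return "pass"              # test purpose reached
        tp_state = tp_next
    return "inconclusive"              # trace ended before a verdict

print(online_oracle(["req", "err"]))   # -> pass
print(online_oracle(["req", "ack"]))   # -> inconclusive
```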
Abstract:
Learning automata arranged in a two-level hierarchy are considered. The automata operate in a stationary random environment and update their action probabilities according to the linear reward-penalty algorithm at each level. Unlike some hierarchical systems previously proposed, no information transfer exists from one level to another, and yet the hierarchy possesses good convergence properties. Using weak-convergence concepts, it is shown that for large time and small values of parameters in the algorithm, the evolution of the optimal path probability can be represented by a diffusion whose parameters can be computed explicitly.
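As an illustration of the scheme described, here is a rough sketch of a two-level hierarchy in which each automaton on the chosen path updates on the same binary environment response, with no inter-level transfer. The symmetric linear reward-penalty update is the standard one from the literature; the environment's success probabilities and all parameter values below are hypothetical.

```python
import random

def lrp_update(p, chosen, reward, a=0.02):
    """One symmetric linear reward-penalty (a = b) step on probabilities p."""
    r = len(p)
    if reward:
        return [pj + a * (1 - pj) if j == chosen else (1 - a) * pj
                for j, pj in enumerate(p)]
    return [(1 - a) * pj if j == chosen else a / (r - 1) + (1 - a) * pj
            for j, pj in enumerate(p)]

# Hypothetical success probabilities of the four leaf actions:
D = [[0.2, 0.4], [0.7, 0.3]]
top = [0.5, 0.5]
bottom = [[0.5, 0.5], [0.5, 0.5]]

random.seed(0)
for _ in range(20000):
    i = random.choices(range(2), weights=top)[0]       # top level picks a branch
    j = random.choices(range(2), weights=bottom[i])[0] # second level picks a leaf
    reward = random.random() < D[i][j]
    top = lrp_update(top, i, reward)                   # both levels update on the
    bottom[i] = lrp_update(bottom[i], j, reward)       # same response, independently

print(top, bottom)   # mass should favour the path to D[1][0] = 0.7
```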
Abstract:
Systems of learning automata have been studied by various researchers to evolve useful strategies for decision making under uncertainty. Considered in this paper is a class of hierarchical systems of learning automata where the system gets responses from its environment at each level of the hierarchy. A classification of such sequential learning tasks based on the complexity of the learning problem is presented. It is shown that none of the existing algorithms can handle the most general type of hierarchical problem. An algorithm for learning the globally optimal path in this general setting is presented, and its convergence is established. This algorithm needs information transfer from the lower levels to the higher levels. Using the methodology of estimator algorithms, this model can be generalized to accommodate other kinds of hierarchical learning tasks.
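The estimator methodology the abstract refers to can be sketched for a single automaton as below: maintain sample-mean reward estimates and "pursue" the currently best action. This is a plain pursuit update for illustration only, not the paper's multi-level algorithm with lower-to-higher information transfer; all parameters are hypothetical.

```python
import random

def pursuit_step(p, est, cnt, chosen, response, lam=0.01):
    """Estimator (pursuit) step: update the reward estimate of the chosen
    action, then move probability mass toward the current best estimate."""
    cnt[chosen] += 1
    est[chosen] += (response - est[chosen]) / cnt[chosen]   # running mean
    best = max(range(len(p)), key=lambda j: est[j])
    return [(1 - lam) * pj + (lam if j == best else 0.0)
            for j, pj in enumerate(p)]

random.seed(2)
REWARD = [0.3, 0.8, 0.5]                    # hypothetical environment
p, est, cnt = [1 / 3] * 3, [0.0] * 3, [0] * 3
for _ in range(5000):
    a = random.choices(range(3), weights=p)[0]
    r = 1.0 if random.random() < REWARD[a] else 0.0
    p = pursuit_step(p, est, cnt, a, r)
print([round(x, 2) for x in p])             # mass should gather on action 1
```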
Abstract:
The term acclimation has been used with several connotations in the field of acclimatory physiology. An attempt has been made, in this paper, to define precisely the term “acclimation” for effective modelling of acclimatory processes. Acclimation is defined with respect to a specific variable, as cumulative experience gained by the organism when subjected to a step change in the environment. Experimental observations on a large number of variables in animals exposed to sustained stress show that after initial deviation from the basal value (defined as “growth”), the variables tend to return to basal levels (defined as “decay”). This forms the basis for modelling biological responses in terms of their growth and decay. Hierarchical systems theory as presented by Mesarovic, Macko & Takahara (1970) facilitates modelling of complex and partially characterized systems. This theory, in conjunction with “growth-decay” analysis of biological variables, is used to model the temperature-regulating system in animals exposed to cold. This approach appears to be applicable at all levels of biological organization. Regulation of hormonal activity, which forms a part of the temperature-regulating system, and the relationship of the latter with the “energy” system of the animal of which it forms a part, are also effectively modelled by this approach. It is believed that this systematic approach would eliminate much of the current circular thinking in the area of acclimatory physiology.
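One simple functional form consistent with the growth-decay description (deviation from a basal value after a step change, followed by a return to it) is a difference of exponentials. The variable, its basal value, and the time constants below are hypothetical; the paper's actual model is not reproduced.

```python
import math

# Hypothetical parameters: basal value, deviation amplitude, and the
# "growth" and "decay" time constants (tau_g < tau_d gives rise-then-return).
basal, A, tau_g, tau_d = 37.0, 2.5, 1.0, 10.0

def response(t):
    """Growth-decay response to a step stress applied at t = 0."""
    return basal + A * (math.exp(-t / tau_d) - math.exp(-t / tau_g))

for t in [0, 1, 3, 10, 30]:
    print(t, round(response(t), 3))   # rises above basal, then relaxes back
```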
Abstract:
This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While the developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, a facet of operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software “bugs”, the failure history of the software system in the various phases of its lifecycle, the reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process have all been considered to varying degrees of detail. We also discuss the notion of software fault-tolerance, methods of achieving the same, and the status of other measures of software dependability such as maintainability, availability and safety.
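As one concrete instance of the reliability-growth and remaining-error topics such surveys cover, here is a sketch of the classical Goel-Okumoto NHPP model. Whether this particular model appears in the paper is an assumption, and the parameter values are hypothetical.

```python
import math

# Goel-Okumoto NHPP model: expected failures observed by time t is
# m(t) = a * (1 - exp(-b * t)), where a is the total expected number of
# faults and b the per-fault detection rate (both hypothetical here).
a, b = 120.0, 0.05

def expected_failures(t):
    return a * (1.0 - math.exp(-b * t))

def expected_remaining(t):
    """Estimated faults still latent after t units of testing."""
    return a - expected_failures(t)      # = a * exp(-b * t)

for t in (0, 10, 50, 100):
    print(t, round(expected_failures(t), 1), round(expected_remaining(t), 1))
```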
Abstract:
An approach is presented for hierarchical control of an ammonia reactor, which is a key unit process in a nitrogen fertilizer complex. The aim of the control system is to ensure safe operation of the reactor around the optimal operating point in the face of process variable disturbances and parameter variations. The four different layers perform the functions of regulation, optimization, adaptation, and self-organization. The simulation for this proposed application is conducted on an AD511 hybrid computer in which the AD5 analog processor is used to represent the process and the PDP-11/35 digital computer is used for the implementation of control laws. Simulation results relating to the different layers have been presented.
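The multirate character of such a layered scheme (a fast regulation loop, with slower optimization and adaptation layers re-tuning it) can be sketched as below. The toy first-order process, the proportional regulator, and all gains are hypothetical; the paper's actual control laws are not reproduced.

```python
# Layered control sketch: regulation acts every step; the optimization and
# adaptation layers intervene at slower rates. All values are hypothetical.
x, setpoint, gain = 0.0, 1.0, 0.5

for k in range(300):
    if k % 100 == 0 and k > 0:
        setpoint *= 1.02               # optimization layer: revise operating point
    if k % 50 == 0 and k > 0:
        gain = min(1.0, gain * 1.05)   # adaptation layer: re-tune the regulator
    u = gain * (setpoint - x)          # regulation layer: proportional control
    x += 0.1 * (-x + u)                # toy first-order process dynamics
print(round(x, 3), round(setpoint, 3))
```

Note that plain proportional regulation leaves a steady-state offset from the setpoint; the sketch only illustrates the time-scale separation between the layers.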
Abstract:
A learning automaton operating in a random environment updates its action probabilities on the basis of the reactions of the environment, so that asymptotically it chooses the optimal action. When the number of actions is large the automaton becomes slow because there are too many updatings to be made at each instant. A hierarchical system of such automata with assured ε-optimality is suggested to overcome that problem. The learning algorithm for the hierarchical system turns out to be a simple modification of the absolutely expedient algorithm known in the literature. The parameters of the algorithm at each level in the hierarchy depend only on the parameters and the action probabilities of the previous level. It follows that, to minimize the number of updatings per cycle, each automaton in the hierarchy need only have two or three actions.
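The structural point can be sketched as follows: with N leaf actions arranged in a binary tree, each cycle touches only one small automaton per level (about log2 N updates) instead of updating an N-vector. The paper's ε-optimal algorithm, which couples each level's parameters to the previous level's action probabilities, is not reproduced; a plain reward-inaction update is used purely for illustration, and the environment is hypothetical.

```python
import random

DEPTH = 3                               # N = 2**DEPTH = 8 leaf actions
nodes = {}                              # path prefix -> [p_left, p_right]

def choose_path():
    path, prefix = [], ""
    for _ in range(DEPTH):
        p = nodes.setdefault(prefix, [0.5, 0.5])
        bit = 0 if random.random() < p[0] else 1
        path.append((prefix, bit))
        prefix += str(bit)
    return path, int(prefix, 2)         # leaf index in [0, N)

random.seed(1)
REWARD = [0.1, 0.2, 0.3, 0.2, 0.9, 0.1, 0.2, 0.3]   # hypothetical environment
for _ in range(30000):
    path, leaf = choose_path()
    if random.random() < REWARD[leaf]:  # reward-inaction: update on reward only,
        for prefix, bit in path:        # and only the DEPTH automata on the path
            p = nodes[prefix]
            p[bit] += 0.01 * (1 - p[bit])
            p[1 - bit] *= 0.99

prefix = ""
for _ in range(DEPTH):                  # greedy descent through the learned tree
    p = nodes.get(prefix, [0.5, 0.5])
    prefix += "0" if p[0] >= p[1] else "1"
print("learned leaf:", int(prefix, 2))  # expected to be 4 (reward 0.9)
```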
Abstract:
An algorithm is described for developing a hierarchy among a set of elements having certain precedence relations. This algorithm, which is based on tracing a path through the graph, is easily implemented by a computer.
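The abstract does not spell out the path-tracing method, so here is a minimal sketch of the underlying task using a standard topological-level assignment: elements with precedence relations are placed on levels so that every element sits above all of its predecessors. The element names and edges are hypothetical.

```python
from collections import defaultdict

edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]  # "a precedes b", ...

succ, indeg = defaultdict(list), defaultdict(int)
nodes = {n for e in edges for n in e}
for u, v in edges:
    succ[u].append(v)
    indeg[v] += 1

level = {n: 0 for n in nodes}
frontier = [n for n in nodes if indeg[n] == 0]
while frontier:                      # Kahn-style topological sweep
    u = frontier.pop()
    for v in succ[u]:
        level[v] = max(level[v], level[u] + 1)
        indeg[v] -= 1
        if indeg[v] == 0:
            frontier.append(v)

print(dict(sorted(level.items())))   # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```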
Abstract:
In this paper, the control aspects of a hierarchical organization under the influence of "proportionality" policies are analyzed. Proportionality policies are those that restrict the recruitment to every level of the hierarchy (except the bottommost, or base, level) to be in strict proportion to the promotions into that level. Both long-term and short-term control analyses are discussed. In long-term control, the specific roles of the system parameters in controlling the shape and size of the system are analyzed, yielding suitable control strategies. In short-term control, the attainability of a target or goal structure within a specific time from a given initial structure is analyzed, yielding the required recruitment strategies. The theoretical analyses are illustrated with computational examples and with real-world data. The control of such proportionality systems is then compared with that of general systems (which do not follow such policies), with some significant conclusions. The control relations of proportionality systems are found to be simpler and more practically feasible than those of general Markov systems, which do not have such restrictions. Proportionality systems thus not only match the flexibility of general Markov systems but also have the added advantage of simpler and more practically feasible controls. The proportionality policies hence act as an alternative and more practically feasible means of control.
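A sketch of one census step of a Markov manpower system under such a proportionality policy is shown below: recruitment into every level above the base is a fixed proportion of the promotions into that level, while base-level recruitment is unconstrained. All rates and the initial structure are hypothetical, and the paper's control analysis is not reproduced.

```python
n = [100.0, 40.0, 10.0]       # staff in levels 1 (base) .. 3 (top)
stay  = [0.80, 0.85, 0.90]    # fraction retained in place (remainder is wastage)
promo = [0.08, 0.03]          # fraction promoted from level i to i+1
c     = [0.25, 0.25]          # recruits into level i+1 per promotion into it
base_recruits = 25.0          # base-level recruitment is unconstrained

def step(n):
    nxt = [stay[i] * n[i] for i in range(len(n))]
    nxt[0] += base_recruits
    for i in range(len(n) - 1):
        flow = promo[i] * n[i]               # promotions into level i+1
        nxt[i + 1] += (1.0 + c[i]) * flow    # promotees plus proportional recruits
    return nxt

for _ in range(50):
    n = step(n)
print([round(x, 1) for x in n])   # long-run grade structure under the policy
```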
Abstract:
A simple ball-drop impact tester is developed for studying the dynamic response of hierarchical, complex, small-sized systems and materials. The developed algorithm and set-up have provisions for applying a programmable potential difference along the height of a test specimen during an impact loading; this enables us to conduct experiments on various materials and smart structures whose mechanical behavior is sensitive to electric field. The software-hardware system allows not only acquisition of dynamic force-time data at a very fast sampling rate (up to 2 × 10⁶ samples/s), but also application of a pre-set potential difference (up to ±10 V) across a test specimen for a duration determined by feedback from the force-time data. We illustrate the functioning of the set-up by studying the effect of electric field on the energy absorption capability of carbon nanotube foams of 5 × 5 × 1.2 mm³ size under impact conditions.
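The feedback logic can be sketched in a highly simplified form: sample a force channel and hold a pre-set potential difference across the specimen while the measured force exceeds a trigger level. Here read_force() is a simulated stand-in for the actual data-acquisition hardware, which the abstract does not describe, and all numbers are hypothetical.

```python
import math

def read_force(k):
    """Simulated force sample: a half-sine impact pulse (hypothetical)."""
    return 50.0 * math.sin((k - 200) * math.pi / 400) if 200 <= k <= 600 else 0.0

TRIGGER, V_SET = 5.0, 10.0            # newtons / volts, both pre-set values
log = []
for k in range(1000):                 # samples acquired at a fixed fast rate
    f = read_force(k)
    voltage = V_SET if f > TRIGGER else 0.0   # feedback from force-time data
    log.append((k, f, voltage))
print(sum(1 for _, _, v in log if v > 0), "samples with field applied")
```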
Abstract:
Prediction of queue waiting times of jobs submitted to production parallel batch systems is important to provide overall estimates to users and can also help meta-schedulers make scheduling decisions. In this work, we have developed a framework for predicting ranges of queue waiting times for jobs by employing multi-class classification of similar jobs in history. Our hierarchical prediction strategy first predicts the point wait time of a job using a dynamic k-Nearest Neighbor (kNN) method. It then performs a multi-class classification using Support Vector Machines (SVMs) among all the classes of the jobs. The probabilities given by the SVM for the class predicted using kNN and its neighboring classes are used to provide a set of ranges of predicted wait times with probabilities. We have used these predictions and probabilities in a meta-scheduling strategy that distributes jobs to different queues/sites in a multi-queue/grid environment to minimize the wait times of the jobs. Experiments with different production supercomputer job traces show that our prediction strategies give correct predictions for about 77-87% of the jobs, and also result in about 12% improved accuracy when compared to the next best existing method. Experiments with our meta-scheduling strategy using different production and synthetic job traces for various system sizes, partitioning schemes and workloads show that the meta-scheduling strategy gives much improved performance when compared to existing scheduling policies, reducing the overall average queue waiting times of the jobs by about 47%.
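A rough scikit-learn sketch of the two-stage idea follows: a kNN point estimate of the wait time, refined by an SVM whose per-class probabilities turn the estimate into ranged, probability-weighted predictions. The features, class boundaries, and synthetic data are all hypothetical, and the paper's dynamic-k selection and feature engineering are not reproduced.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 3))      # e.g. cores, runtime, queue load
wait = 1000 * X[:, 0] + 500 * X[:, 2] + rng.normal(0, 50, 500)

edges = np.array([250.0, 600.0, 1000.0])      # wait-time class boundaries (s)
y_cls = np.digitize(wait, edges)              # four wait-time classes

knn = KNeighborsRegressor(n_neighbors=5).fit(X, wait)   # stage 1: point estimate
svm = SVC(probability=True).fit(X, y_cls)               # stage 2: class probs

job = X[:1]
point = float(knn.predict(job)[0])
probs = svm.predict_proba(job)[0]
cls = int(np.digitize([point], edges)[0])
idx = list(svm.classes_).index(cls)           # map class label to prob column
print(f"point estimate {point:.0f}s, class {cls}, probability {probs[idx]:.2f}")
```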
Abstract:
We show that a film of a suspension of polymer-grafted nanoparticles on a liquid substrate can be employed to create two-dimensional nanostructures with a remarkable variation in the pattern length scales. The presented experiments also reveal the emergence of concentration-dependent bimodal patterns as well as re-entrant behaviour that involves length scales due to dewetting and compositional instabilities. The experimental observations are explained through a gradient dynamics model consisting of coupled evolution equations for the height of the suspension film and the concentration of polymer. Using a Flory-Huggins free energy functional for the polymer solution, we show in a linear stability analysis that the thin film undergoes dewetting and/or compositional instabilities depending on the concentration of the polymer in the solution. We argue that the formation via “hierarchical self-assembly” of various functional nanostructures observed in different systems can be explained as resulting from such an interplay of instabilities.
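The concentration dependence of the compositional instability can be sketched with the standard Flory-Huggins spinodal criterion: for f(φ) = (φ/N) ln φ + (1-φ) ln(1-φ) + χ φ(1-φ), the mixture is locally unstable wherever f″(φ) = 1/(Nφ) + 1/(1-φ) - 2χ < 0. The values of N and χ below are hypothetical, and the paper's full gradient-dynamics model, which couples this to the film height and dewetting, is not reproduced.

```python
# Spinodal (compositional-instability) window of a Flory-Huggins solution.
N, chi = 100, 1.2          # chain length and interaction parameter (hypothetical)

def fpp(phi):
    """Second derivative of the Flory-Huggins free energy per site."""
    return 1.0 / (N * phi) + 1.0 / (1.0 - phi) - 2.0 * chi

unstable = [round(phi, 2) for phi in (i / 100 for i in range(1, 100))
            if fpp(phi) < 0.0]
print(unstable[0], "<= phi <=", unstable[-1])   # spinodal concentration range
```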