Abstract:
This paper presents an empirical investigation of policy-based self-management techniques for parallel applications executing in loosely-coupled environments. The dynamic and heterogeneous nature of these environments is discussed and the special considerations for parallel applications are identified. An adaptive strategy for the run-time deployment of tasks of parallel applications is presented. The strategy is based on embedding numerous policies which are informed by contextual and environmental inputs. The policies govern various aspects of behaviour, enhancing flexibility so that the goals of efficiency and performance are achieved despite high levels of environmental variability. A prototype self-managing parallel application is used as a vehicle to explore the feasibility and benefits of the strategy. In particular, several aspects of stability are investigated. The implementation and behaviour of three policies are discussed and sample results examined.
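The adaptive deployment strategy can be sketched as a simple load-aware policy informed by environmental input. Everything below (the node names, the load metric, the deferral threshold) is an illustrative assumption, not the paper's actual implementation:

```python
# A minimal sketch (not the paper's implementation) of a policy that adapts
# task deployment to environmental variability: nodes report a load figure,
# and the policy picks the least-loaded node unless even that node exceeds a
# threshold, in which case deployment is deferred.

def deploy_task(node_loads, defer_threshold=0.9):
    """Return the chosen node name, or None to defer deployment.

    node_loads: dict mapping node name -> load in [0, 1] (hypothetical input).
    """
    if not node_loads:
        return None
    node, load = min(node_loads.items(), key=lambda kv: kv[1])
    return node if load < defer_threshold else None

print(deploy_task({"n1": 0.7, "n2": 0.3, "n3": 0.95}))  # -> n2
print(deploy_task({"n1": 0.95, "n2": 0.92}))            # -> None (defer)
```

In the paper's setting such a policy would be one of several embedded policies, each re-evaluated at run time as contextual inputs change.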
Abstract:
This paper presents innovative work in the development of policy-based autonomic computing. The core of the work is a powerful and flexible policy-expression language AGILE, which facilitates run-time adaptable policy configuration of autonomic systems. AGILE also serves as an integrating platform for other self-management technologies including signal processing, automated trend analysis and utility functions. Each of these technologies has specific advantages and applicability to different types of dynamic adaptation. The AGILE platform enables seamless interoperability of the different technologies to each perform various aspects of self-management within a single application. The various technologies are implemented as object components. Self-management behaviour is specified using the policy language semantics to bind the various components together as required. Since the policy semantics support run-time re-configuration, the self-management architecture is dynamically composable. Additional benefits include the standardisation of the application programmer interface, terminology and semantics, and only a single point of embedding is required.
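As an illustration of policy-driven binding of components with run-time re-configuration, a toy policy engine might look as follows. AGILE's actual syntax and semantics are not reproduced here; all names and the rule representation below are hypothetical:

```python
# A minimal sketch of run-time re-configurable policy binding: rules pair a
# condition with an action component, and the rule set can be changed while
# the system runs. Purely illustrative; not the AGILE language itself.

class PolicyEngine:
    def __init__(self):
        self.rules = []  # list of (condition, action) pairs

    def add_rule(self, condition, action):
        self.rules.append((condition, action))

    def remove_rule(self, index):
        del self.rules[index]  # run-time re-configuration

    def evaluate(self, context):
        # Fire the first rule whose condition matches the context.
        for condition, action in self.rules:
            if condition(context):
                return action(context)
        return None

engine = PolicyEngine()
engine.add_rule(lambda c: c["load"] > 0.8, lambda c: "scale_out")
engine.add_rule(lambda c: c["load"] < 0.2, lambda c: "scale_in")
print(engine.evaluate({"load": 0.9}))  # -> scale_out
```

Because the actions are ordinary objects, other self-management components (trend analysers, utility functions) could be bound in the same way, which is the integration idea the abstract describes.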
Abstract:
In this paper, a method for integrating several numerical analytical techniques used in microsystems design and failure analysis is presented. The analytical techniques are categorized into four groups: high-fidelity analytical tools, i.e. the finite element (FE) method; fast analytical tools, referring to reduced order modeling (ROM); optimization tools; and probability-based analytical tools. The characteristics of these four tools are investigated. The interactions between the four tools are discussed and a methodology for coupling them is offered. This methodology consists of three stages: reduced order modeling, deterministic optimization and probabilistic optimization. Using this methodology, a case study on the optimization of a solder joint is conducted. It is shown that these analysis techniques interact with and complement one another, and that their combined application can fully exploit the advantages of each technique and satisfy various design requirements. The case study shows that the coupling method for the different tools provided by this paper is effective and efficient, and it is highly relevant to the design and reliability analysis of microsystems.
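The three-stage methodology can be illustrated on a toy one-dimensional problem. The quadratic surrogate, the performance limit and the input scatter below are illustrative assumptions standing in for a real FE-based workflow:

```python
import random

def expensive_model(x):  # stand-in for a high-fidelity (e.g. FE) analysis
    return (x - 2.0) ** 2 + 1.0

# Stage 1: reduced-order model -- a quadratic surrogate fitted to three
# samples at x = 0, 2, 4 (exact for a quadratic true response).
x0, x1, x2 = 0.0, 2.0, 4.0
y0, y1, y2 = (expensive_model(x) for x in (x0, x1, x2))
a = (y0 + y2 - 2 * y1) / 8.0   # curvature from the second difference
b = (y2 - y0 - 16 * a) / 4.0
c = y0

# Stage 2: deterministic optimisation of the surrogate (parabola vertex).
x_star = -b / (2 * a)

# Stage 3: probabilistic analysis -- Monte Carlo scatter about the optimum,
# estimating the probability that a hypothetical performance limit is met.
random.seed(0)
limit = 1.05
n = 10_000
ok = sum(expensive_model(random.gauss(x_star, 0.1)) <= limit
         for _ in range(n))
print(x_star, ok / n)  # optimum near 2.0, reliability well above 0.9
```

The division of labour mirrors the abstract: the cheap surrogate absorbs the many evaluations that deterministic and probabilistic optimisation require, while the expensive model is called only to build (and here, to check) the surrogate.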
Abstract:
Analysis of the generic attacks on and countermeasures for block cipher based message authentication code (MAC) algorithms in sensor applications is undertaken; the conclusions are used in the design of two new MAC constructs: Quicker Block Chaining MAC1 (QBC-MAC1) and Quicker Block Chaining MAC2 (QBC-MAC2). Using software simulation, we show that our new constructs point to improvements in CPU instruction clock-cycle usage and energy requirements when benchmarked against the de facto Cipher Block Chaining MAC (CBC-MAC) based construct used in the TinySec security protocol for wireless sensor networks.
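For readers unfamiliar with the baseline construct, a minimal sketch of CBC-MAC follows. The "cipher" here is a deliberately insecure toy permutation used only to show the chaining structure; it is not one of the ciphers used in TinySec, and the QBC-MAC variants themselves are not reproduced:

```python
# Toy CBC-MAC: each message block is XORed into the running state, which is
# then passed through the block cipher; the final state is the tag. The toy
# cipher below (XOR with key, then byte rotation) offers no security.

BLOCK = 8  # bytes

def toy_cipher(key: bytes, block: bytes) -> bytes:
    # Insecure placeholder for a real block cipher.
    x = bytes(a ^ b for a, b in zip(block, key))
    return x[1:] + x[:1]

def cbc_mac(key: bytes, msg: bytes) -> bytes:
    # Zero-pad to a whole number of blocks (fixed-length use only; plain
    # CBC-MAC is insecure for variable-length messages).
    msg += b"\x00" * (-len(msg) % BLOCK)
    state = bytes(BLOCK)  # zero IV
    for i in range(0, len(msg), BLOCK):
        block = bytes(a ^ b for a, b in zip(state, msg[i:i + BLOCK]))
        state = toy_cipher(key, block)  # chain into the next block
    return state  # the tag is the final cipher state

key = b"8bytekey"
tag = cbc_mac(key, b"sensor reading 42")
assert tag == cbc_mac(key, b"sensor reading 42")  # deterministic
assert tag != cbc_mac(key, b"sensor reading 43")  # detects tampering
```

The per-block cipher invocation in the loop is exactly the cost the paper targets: fewer or cheaper cipher calls translate directly into the CPU-cycle and energy savings claimed for the QBC-MAC constructs.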
Abstract:
A common problem faced by fire safety engineers in the field of evacuation analysis concerns the optimal design of an arbitrarily complex structure in order to minimise evacuation times. How does the engineer determine the best solution? In this study we introduce the concept of numerical optimisation techniques to address this problem. The study makes use of the buildingEXODUS evacuation model coupled with classical optimisation theory, including Design of Experiments (DoE) and Response Surface Models (RSM). We demonstrate the technique using a relatively simple problem: determining the optimal location for a single exit in a square room.
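The DoE/RSM approach can be sketched on a toy one-dimensional version of the exit-placement problem. The evacuation-time function below is a stand-in assumption, not buildingEXODUS, and the design points are chosen purely for illustration:

```python
# DoE/RSM sketch: evaluate a costly evacuation model at a few designed exit
# positions, fit a quadratic response surface by least squares, and read the
# optimal exit location off the fitted surface.

def evacuation_time(x):
    # Stand-in for an evacuation simulation: x is the exit position along
    # one wall, normalised to [0, 1]; the centre empties fastest here.
    return 60.0 + 80.0 * (x - 0.5) ** 2

# Design of Experiments: five evenly spaced design points.
design = [0.0, 0.25, 0.5, 0.75, 1.0]
obs = [evacuation_time(x) for x in design]

# Response Surface Model: least-squares fit of t = a + b*x + c*x^2 via the
# 3x3 normal equations, solved by Gaussian elimination.
def lstsq_quadratic(xs, ys):
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    v = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(3):                       # forward elimination
        for row in range(col + 1, 3):
            f = A[row][col] / A[col][col]
            A[row] = [r - f * p for r, p in zip(A[row], A[col])]
            v[row] -= f * v[col]
    coeff = [0.0, 0.0, 0.0]                    # back substitution
    for row in (2, 1, 0):
        tail = sum(A[row][k] * coeff[k] for k in range(row + 1, 3))
        coeff[row] = (v[row] - tail) / A[row][row]
    return coeff  # [a, b, c]

a, b, c = lstsq_quadratic(design, obs)
x_opt = -b / (2 * c)          # vertex of the fitted parabola
print(round(x_opt, 3))        # -> 0.5 (the centre of the wall)
```

With a real simulator the same loop applies: the DoE limits how many expensive runs are needed, and the RSM then stands in for the simulator during optimisation.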
Abstract:
Stereology typically concerns estimation of properties of a geometric structure from plane section information. This paper provides a brief review of some statistical aspects of this rapidly developing field, with some reference to applications in the earth sciences. After an introductory discussion of the scope of stereology, section 2 briefly mentions results applicable when no assumptions can be made about the stochastic nature of the sampled matrix, statistical considerations then arising solely from the ‘randomness’ of the plane section. The next two sections postulate embedded particles of specific shapes, the particular case of spheres being discussed in some detail. References are made to results for ‘thin slices’ and other probing mechanisms. Randomly located convex particles, of otherwise arbitrary shape, are discussed in section 5, and the review concludes with a specific application of stereological ideas to some data on neolithic mining.
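The sphere case lends itself to a short illustration: a uniformly random plane section of a sphere of radius R shows a circular profile of radius sqrt(R² - d²), where d is the distance of the plane from the centre, and the mean profile radius is πR/4. A quick Monte Carlo check (illustrative only) confirms this, showing how R can be estimated from plane-section data alone:

```python
import math
import random

# Monte Carlo check of the mean section-profile radius of a sphere:
# with the section plane's offset d uniform on [0, R], the circular
# profile has radius sqrt(R^2 - d^2) and expected value pi*R/4.

def mean_profile_radius(R, n, rng):
    total = 0.0
    for _ in range(n):
        d = rng.uniform(0.0, R)            # random section-plane offset
        total += math.sqrt(R * R - d * d)  # radius of circular profile
    return total / n

rng = random.Random(1)
R = 2.0
est = mean_profile_radius(R, 100_000, rng)
print(est, math.pi * R / 4)  # Monte Carlo mean vs theoretical pi*R/4
```

Inverting this relationship (estimating R, or a distribution of radii, from observed profile radii) is the classical Wicksell-type problem that the sphere sections of the review address.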
Abstract:
A simulated in situ incubation box has been compared with in situ exposure for ¹⁴C production measurements in an estuarine environment. Measurements were made over the course of 14 months, mainly in the Tamar estuary; production rates ranged from less than 1 mg C m⁻² h⁻¹ to 350 mg C m⁻² h⁻¹ and there was no significant difference between results from the two methods. In the estuarine waters investigated, the simulated in situ incubator with neutral density filters, used with a Secchi disc to determine sampling depths, gives a satisfactory estimate of in situ primary production.