862 results for Deadlock Analysis, Distributed Systems, Concurrent Systems, Formal Languages


Relevance:

60.00%

Publisher:

Abstract:

Background: It remains unclear whether dental bleaching affects the bond strength of dentin/resin restorations. Purpose: To evaluate the bond strength of adhesive systems to dentin submitted to bleaching with 38% hydrogen peroxide (HP) activated by LED-laser, and to assess the adhesive/dentin interfaces by means of SEM. Study design: Sixty fragments of dentin (25 mm²) were included and divided into two groups: bleached and unbleached. HP was applied for 20 s and photoactivated for 45 s. Groups were subdivided according to the adhesive systems (n = 10): (1) two-step conventional system (Adper Single Bond), (2) two-step self-etching system (Clearfil SE Bond), and (3) one-step self-etching system (Prompt L-Pop). The specimens received Z250 resin and, after 24 h, were submitted to the bond strength test. An additional 30 dentin fragments (n = 5) received the same surface treatments and were prepared for SEM. Data were analyzed by ANOVA and Tukey's test (alpha = 0.05). Results: There was a significant strength reduction in the bleached group when compared to the unbleached group (P < 0.05). Higher bond strength was observed for Prompt. Single Bond and Clearfil presented the smallest values when used on bleached dentin. SEM analysis of the unbleached specimens revealed long tags and a uniform hybrid layer for all adhesives. In bleached dentin, Single Bond produced open tubules with few tags, Clearfil determined the absence of tags and hybrid layer, and Prompt promoted a regular hybrid layer with some tags. Conclusions: Prompt promoted higher shear bond strength, regardless of the bleaching treatment, and allowed the formation of a regular and fine hybrid layer with less deep tags when compared to Single Bond and Clearfil. Microsc. Res. Tech. 74:244-250, 2011. (C) 2010 Wiley-Liss, Inc.
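As an aside on the statistical analysis mentioned above, the following is a minimal sketch of how a one-way ANOVA followed by Tukey's test (alpha = 0.05) can be run on bond-strength data in Python; the group names and values are invented placeholders, not the study's measurements.

```python
# Hypothetical bond-strength values (MPa); these numbers are NOT from the study.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

groups = {
    "SingleBond_bleached": [8.1, 7.4, 9.0, 8.6, 7.9],
    "Clearfil_bleached":   [7.8, 8.2, 7.1, 8.5, 7.6],
    "Prompt_bleached":     [13.2, 12.5, 14.1, 13.7, 12.9],
}
f_stat, p_value = f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
print(pairwise_tukeyhsd(values, labels, alpha=0.05))   # pairwise comparisons at alpha = 0.05
```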

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
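For intuition about the refinement notion the abstract builds on, here is a minimal sketch of the simplest CSP refinement check, trace refinement (traces(Impl) ⊆ traces(Spec)), for small finite labelled transition systems; the transition tables and the bounded-depth exploration are illustrative assumptions, not the paper's Object-Z/CSP semantics.

```python
def traces(transitions, start, depth):
    """All event traces of length <= depth from `start` in a finite LTS."""
    result = {()}
    frontier = {((), start)}
    for _ in range(depth):
        step = set()
        for trace, state in frontier:
            for src, event, dst in transitions:
                if src == state:
                    step.add((trace + (event,), dst))
        result |= {t for t, _ in step}
        frontier = step
    return result

def trace_refines(spec, spec_start, impl, impl_start, depth=6):
    """Impl trace-refines Spec iff every trace of Impl is also a trace of Spec."""
    return traces(impl, impl_start, depth) <= traces(spec, spec_start, depth)

# Toy one-place buffer: the spec alternates put/get; the implementation does the same.
spec = [("empty", "put", "full"), ("full", "get", "empty")]
impl = [("e", "put", "f"), ("f", "get", "e")]
print(trace_refines(spec, "empty", impl, "e"))   # True
```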

Relevance:

60.00%

Publisher:

Abstract:

We compare the performance of two different low-storage filter diagonalisation (LSFD) strategies in the calculation of complex resonance energies of the HO2 radical. The first is carried out within a complex-symmetric Lanczos subspace representation [H. Zhang, S.C. Smith, Phys. Chem. Chem. Phys. 3 (2001) 2281]. The second involves harmonic inversion of a real autocorrelation function obtained via a damped Chebychev recursion [V.A. Mandelshtam, H.S. Taylor, J. Chem. Phys. 107 (1997) 6756]. We find that while the Chebychev approach has the advantage of utilizing real algebra in the time-consuming process of generating the vector recursion, the Lanczos method (using complex vectors) requires fewer iterations, especially for the low-energy part of the spectrum. The overall efficiency in calculating resonances for these two methods is comparable for this challenging system. (C) 2001 Elsevier Science B.V. All rights reserved.
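To make the second strategy concrete, the sketch below generates the real Chebychev autocorrelation sequence for a small random symmetric test matrix; this sequence is the input that harmonic inversion would then process. The test matrix, the initial vector and the omission of damping are simplifying assumptions, not the paper's HO2 calculation.

```python
# Illustrative sketch (not the paper's HO2 calculation): the real Chebychev
# recursion generating the autocorrelation sequence c_k = <psi0|T_k(Hs)|psi0>,
# which harmonic inversion then processes to extract energies. Only real
# algebra is used, which is the advantage noted in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 200
H = rng.standard_normal((n, n))
H = (H + H.T) / 2                                 # toy real-symmetric "Hamiltonian"

# Scale the spectrum into [-1, 1], as the Chebychev recursion requires.
emin, emax = np.linalg.eigvalsh(H)[[0, -1]]
Hs = (2 * H - (emax + emin) * np.eye(n)) / (emax - emin)

psi0 = rng.standard_normal(n)
psi0 /= np.linalg.norm(psi0)

c = []                                            # autocorrelation sequence c_k
prev, cur = psi0, Hs @ psi0                       # T_0 psi0 and T_1 psi0
c.extend([psi0 @ prev, psi0 @ cur])
for _ in range(500):
    prev, cur = cur, 2 * Hs @ cur - prev          # T_{k+1} = 2 Hs T_k - T_{k-1}
    c.append(psi0 @ cur)
# In the paper's scheme, c (with damping applied) is fed to harmonic inversion.
```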

Relevance:

60.00%

Publisher:

Abstract:

Telehealth programmes are rather similar to humans in the way that they are planned and then develop, grow and ultimately die or disappear. To achieve good life expectancy for a telehealth programme there appear to be three major needs: nurturing, which includes the provision of money, ideas, education, training and innovation; experience, which involves an integrated management process, the achievement of long and wide patterns of usage, the development of updated policies and procedures, and the involvement of multiple disciplines; and success, which involves evidence of outcomes, evaluation and research and, most importantly, the sharing of information through scientific and popular press publications, conferences, and collaborations with internal and external groups. The future of telehealth in Australia is at a watershed. There are now a substantial number of programmes, and there has been a large amount of financial and human investment in telehealth around the nation. There is, however, no forum for national leadership, no national association and little support at federal government level.

Relevance:

60.00%

Publisher:

Abstract:

The development of the new TOGA (titration and off-gas analysis) sensor for the detailed study of biological processes in wastewater treatment systems is outlined. The main innovation of the sensor is the amalgamation of titrimetric and off-gas measurement techniques. The resulting measured signals are: hydrogen ion production rate (HPR), oxygen transfer rate (OTR), nitrogen transfer rate (NTR), and carbon dioxide transfer rate (CTR). While OTR and NTR are applicable to aerobic and anoxic conditions, respectively, HPR and CTR are useful signals under all of the conditions found in biological wastewater treatment systems, namely, aerobic, anoxic and anaerobic. The sensor is therefore a powerful tool for studying the key biological processes under all these conditions. A major benefit from the integration of the titrimetric and off-gas analysis methods is that the acid/base buffering systems, in particular the bicarbonate system, are properly accounted for. Experimental data resulting from the TOGA sensor in aerobic, anoxic, and anaerobic conditions demonstrate the strength of the new sensor. In the aerobic environment, carbon oxidation (using acetate as an example carbon source) and nitrification are studied. Both the carbon and ammonia removal rates measured by the sensor compare very well with those obtained from off-line chemical analysis. Further, the aerobic acetate removal process is examined at a fundamental level using the metabolic pathway and stoichiometry established in the literature, whereby the rate of formation of storage products is identified. Under anoxic conditions, the denitrification process is monitored and, again, the measured rate of nitrogen gas transfer (NTR) matches well with the removal of the oxidised nitrogen compounds (measured chemically). In the anaerobic environment, the enhanced biological phosphorus removal process was investigated. In this case, the measured sensor signals (HPR and CTR) resulting from acetate uptake were used to determine the ratio of the rates of carbon dioxide production by competing groups of microorganisms, which consequently is a measure of the activity of these organisms. The sensor involves the use of expensive equipment such as a mass spectrometer and requires special gases to operate, thus incurring significant capital and operational costs. This makes the sensor more an advanced laboratory tool than an on-line sensor. (C) 2003 Wiley Periodicals, Inc.
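As a rough illustration of how gas transfer rates such as OTR and CTR can be obtained from off-gas measurements, the sketch below applies an idealised steady-state gas-phase mass balance; the flow rate, gas fractions and reactor volume are made-up example values, and this is not the TOGA sensor's actual signal processing (which also accounts for the bicarbonate buffering system).

```python
# Idealised off-gas mass balance; all numbers below are invented examples.
R = 8.314          # J/(mol*K)

def gas_molar_flow(q_gas_l_per_min, temp_k=293.15, pressure_pa=101325.0):
    """Convert a volumetric gas flow (L/min) to a molar flow (mol/min), ideal gas."""
    return pressure_pa * (q_gas_l_per_min / 1000.0) / (R * temp_k)

def transfer_rate(y_in, y_out, q_gas_l_per_min, v_liquid_l):
    """Transfer rate of one gas species (mol/(L*min)) from inlet/outlet mole fractions."""
    n_dot = gas_molar_flow(q_gas_l_per_min)
    return n_dot * (y_in - y_out) / v_liquid_l

# Example: oxygen transfer rate (OTR) and carbon dioxide transfer rate (CTR).
otr = transfer_rate(y_in=0.2095, y_out=0.1980, q_gas_l_per_min=2.0, v_liquid_l=4.0)
ctr = transfer_rate(y_in=0.0004, y_out=0.0062, q_gas_l_per_min=2.0, v_liquid_l=4.0)  # negative: CO2 leaves the liquid
print(f"OTR = {otr:.2e} mol O2/(L*min), CTR = {ctr:.2e} mol CO2/(L*min)")
```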

Relevance:

60.00%

Publisher:

Abstract:

Fixed-point roundoff noise in digital implementation of linear systems arises due to overflow, quantization of coefficients and input signals, and arithmetical errors. In uniform white-noise models, the last two types of roundoff errors are regarded as uniformly distributed independent random vectors on cubes of suitable size. For input signal quantization errors, the heuristic model is justified by a quantization theorem, which cannot be directly applied to arithmetical errors due to the complicated input-dependence of errors. The complete uniform white-noise model is shown to be valid in the sense of weak convergence of probability measures as the lattice step tends to zero if the matrices of realization of the system in the state space satisfy certain nonresonance conditions and the finite-dimensional distributions of the input signal are absolutely continuous.
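A quick numerical illustration of the uniform white-noise model for quantization errors (a toy check, not the paper's proof): round a smooth input signal to a lattice of step q and compare the empirical error variance with the q²/12 predicted for an error uniformly distributed on [-q/2, q/2].

```python
# Toy check of the uniform white-noise model; the test signal is an invented example.
import numpy as np

q = 2.0 ** -10                                   # quantization (lattice) step
t = np.linspace(0.0, 1.0, 100_000)
x = np.sin(2 * np.pi * 7 * t) + 0.3 * np.sin(2 * np.pi * 31 * t)   # smooth test input

e = np.round(x / q) * q - x                      # rounding (quantization) error
print("empirical error variance :", e.var())
print("uniform model, q^2 / 12  :", q ** 2 / 12)
print("max |error| vs q / 2     :", np.abs(e).max(), q / 2)
```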

Relevance:

60.00%

Publisher:

Abstract:

This paper discusses a document discovery tool based on Conceptual Clustering by Formal Concept Analysis. The program allows users to navigate e-mail using a visual lattice metaphor rather than a tree. It implements a virtual file structure over e-mail where files and entire directories can appear in multiple positions. The content and shape of the lattice formed by the conceptual ontology can assist in e-mail discovery. The system described provides more flexibility in retrieving stored e-mails than what is normally available in e-mail clients. The paper discusses how conceptual ontologies can leverage traditional document retrieval systems and aid knowledge discovery in document collections.
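To illustrate the underlying idea, the sketch below enumerates the formal concepts (extent, intent) of a tiny object-attribute context by brute force; the toy e-mail "context" is invented and is not the described system's data or algorithm.

```python
# Minimal Formal Concept Analysis sketch: every printed (extent, intent) pair
# is one node of the concept lattice that the tool would let users navigate.
from itertools import combinations

context = {                     # object -> set of attributes (toy e-mail metadata)
    "mail1": {"project-x", "budget"},
    "mail2": {"project-x", "meeting"},
    "mail3": {"budget", "meeting"},
    "mail4": {"project-x", "budget", "meeting"},
}
attributes = set().union(*context.values())

def extent(needed):             # objects having every attribute in `needed`
    return {o for o, attrs in context.items() if needed <= attrs}

def intent(objs):               # attributes shared by every object in `objs`
    return attributes.intersection(*(context[o] for o in objs)) if objs else set(attributes)

concepts = set()
for r in range(len(attributes) + 1):
    for subset in combinations(sorted(attributes), r):
        ext = extent(set(subset))
        concepts.add((frozenset(ext), frozenset(intent(ext))))

for ext, att in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(ext), "|", sorted(att))
```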

Relevance:

60.00%

Publisher:

Abstract:

This paper is concerned with methods for refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems which involve state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components in an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed in a refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP where we change the structure of the specification when performing the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model and they are illustrated via a number of examples.
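As a toy illustration of changing structure during refinement, the sketch below replaces a single abstract component by an interleaving parallel composition of two smaller processes and checks the composition's traces against the abstract specification; the processes and the finite-trace check are invented examples, not the paper's simulation rules.

```python
def interleave(p, q):
    """All interleavings of two event sequences (CSP-style ||| on finite traces)."""
    if not p: return [q]
    if not q: return [p]
    return [(p[0],) + rest for rest in interleave(p[1:], q)] + \
           [(q[0],) + rest for rest in interleave(p, q[1:])]

# Abstract spec: a job is logged and archived, in either order, after being received.
spec_traces = {("recv", "log", "arch"), ("recv", "arch", "log")}

# Concrete design: after `recv`, a logger process and an archiver process run in parallel.
logger, archiver = ("log",), ("arch",)
impl_traces = {("recv",) + t for t in interleave(logger, archiver)}

print(impl_traces <= spec_traces)   # True: the parallel design trace-refines the spec
```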

Relevance:

60.00%

Publisher:

Abstract:

Software architecture is currently recognized as one of the most critical design steps in Software Engineering. The specification of the overall system structure, on the one hand, and of the interaction patterns between its components, on the other, has become a major concern for the working developer. Although a number of formalisms are available to express behaviour and to supply the indispensable calculational power to reason about designs, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from behavioural specifications written in Cw, the corresponding architectural skeletons in the Microsoft .NET framework in the form of executable code.
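As a loose analogue of the approach (in Python rather than Cw and .NET), the sketch below derives an executable class skeleton from a small behavioural specification; the specification format and the names are assumptions for illustration only.

```python
# Toy skeleton derivation: component -> list of operations is an invented spec format.
behaviour_spec = {
    "Buffer": ["put", "get"],
    "Logger": ["write", "flush"],
}

def skeleton(component, operations):
    lines = [f"class {component}:"]
    for op in operations:
        lines += [f"    def {op}(self, *args):",
                  f"        raise NotImplementedError('{component}.{op} to be filled in')"]
    return "\n".join(lines)

generated = "\n\n".join(skeleton(c, ops) for c, ops in behaviour_spec.items())
print(generated)         # architectural skeleton ready to be completed by the developer
exec(generated)          # the skeleton is itself executable code
```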

Relevance:

60.00%

Publisher:

Abstract:

Graphical user interfaces (GUIs) are critical components of today's open source software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing open source systems. We use static analysis techniques to generate models of the user interface behaviour from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is state machines. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable the analysis of interactive systems through inspection of their GUI source code.
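The kind of graph analysis described can be sketched as follows, using an invented toy dialog state machine: reachability from the initial state flags orphan states, and a McCabe-style cyclomatic complexity gives a rough quality metric; none of this is the tool's actual implementation.

```python
# Hedged sketch: simple graph metrics over a toy GUI state machine.
from collections import deque

transitions = [            # (state, event, next_state) for a toy login dialog
    ("Start", "open", "Login"),
    ("Login", "ok", "Main"),
    ("Login", "cancel", "Start"),
    ("Main", "logout", "Start"),
    ("Orphan", "noop", "Orphan"),      # a state no path reaches: a usability smell
]
states = {s for s, _, t in transitions} | {t for _, _, t in transitions}

adj = {s: [] for s in states}
for s, _, t in transitions:
    adj[s].append(t)

seen, queue = {"Start"}, deque(["Start"])
while queue:
    for nxt in adj[queue.popleft()]:
        if nxt not in seen:
            seen.add(nxt); queue.append(nxt)

edges, nodes = len(transitions), len(states)
print("unreachable states    :", states - seen)        # {'Orphan'}
print("cyclomatic complexity :", edges - nodes + 2)    # E - N + 2 for one connected component
```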

Relevance:

60.00%

Publisher:

Abstract:

The current study focuses on the analysis of pressure surge damping in single pipeline systems generated by a fast change of flow conditions. A dimensionless form of the pressurised transient flow equations was developed, presenting the main advantage of being independent of the system characteristics. In the absence of flow velocity profiles, the unsteady friction in turbulent regimes is analysed based on two new empirical corrective coefficients associated with the local and convective acceleration terms. A new surge damping approach is also presented, taking into account the pressure peak time variation. The observed attenuation effect in the pressure wave for highly deformable pipe materials can be described by a combination of the non-elastic behaviour of the pipe wall with steady and unsteady friction effects. Several simulations and experimental tests have been carried out in order to analyse the dynamic response of single pipelines with different characteristics, such as pipe materials, diameters, thicknesses, lengths and transient conditions.
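For background, the classical one-dimensional water hammer equations that such dimensionless formulations typically start from are given below; the paper's own dimensionless variables and its two empirical unsteady-friction coefficients are not reproduced here.

```latex
% Classical 1-D water hammer equations (background only, not the paper's form):
%   H: piezometric head, Q: discharge, a: pressure wave speed, A: pipe cross-section,
%   D: diameter, g: gravity, f: Darcy-Weisbach friction factor.
\begin{align}
  \frac{\partial H}{\partial t} + \frac{a^{2}}{gA}\,\frac{\partial Q}{\partial x} &= 0
  && \text{(continuity)}\\
  \frac{\partial Q}{\partial t} + gA\,\frac{\partial H}{\partial x}
    + \frac{f\,Q\lvert Q\rvert}{2DA} &= 0
  && \text{(momentum with quasi-steady friction)}
\end{align}
```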

Relevance:

60.00%

Publisher:

Abstract:

This abstract presents an energy management system included in a SCADA system in an intelligent home. The system controls the home energy resources according to the players' definitions (electricity consumption and comfort levels), the real-time variation of electricity prices, and the demand response (DR) events proposed by the aggregators.
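A toy rule-based sketch of the kind of decision such a system makes is shown below; the loads, comfort priorities, price threshold and DR signal are invented examples, not the actual SCADA logic.

```python
# Invented example loads and thresholds; illustrative only.
loads = [   # (name, power_kW, comfort priority: 1 = may shed first .. 3 = keep on)
    ("water_heater", 1.5, 1),
    ("air_conditioner", 2.0, 2),
    ("refrigerator", 0.2, 3),
]

def plan(price_eur_per_kwh, dr_event_active, price_threshold=0.25, max_shed_priority=2):
    """Decide which loads to shed for the next control period."""
    if price_eur_per_kwh <= price_threshold and not dr_event_active:
        return []                                        # cheap energy, no DR event: do nothing
    return [name for name, _, prio in loads if prio <= max_shed_priority]

print(plan(price_eur_per_kwh=0.31, dr_event_active=False))  # ['water_heater', 'air_conditioner']
print(plan(price_eur_per_kwh=0.18, dr_event_active=True))   # a DR event forces shedding too
```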

Relevance:

60.00%

Publisher:

Abstract:

Electricity markets are complex environments, involving a large number of different entities, playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator used to model market players and simulate their operation in the market. Market players are entities with specific characteristics and objectives, making their decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains out of the market. This method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict the electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error for the next forecast and uses it to adapt the actual forecast. The goal is to approximate the forecast to the real value, reducing the forecasting error.
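A minimal sketch of the adaptation idea follows, using a simple moving average of recent forecast errors as the expected-error estimate; the error model and the numbers are assumptions, and the paper's pattern-recognition method is more elaborate.

```python
# Toy error-adapted forecaster; the price history below is invented.
from collections import deque

class ErrorAdaptedForecaster:
    def __init__(self, window=5):
        self.recent_errors = deque(maxlen=window)   # forecast - real, most recent last

    def adapt(self, raw_forecast):
        expected_error = (sum(self.recent_errors) / len(self.recent_errors)
                          if self.recent_errors else 0.0)
        return raw_forecast - expected_error        # pull the forecast toward reality

    def observe(self, raw_forecast, real_value):
        self.recent_errors.append(raw_forecast - real_value)

f = ErrorAdaptedForecaster()
history = [(52.0, 49.5), (50.0, 47.8), (55.0, 52.6)]   # (ANN forecast, real price) in EUR/MWh
for raw, real in history:
    f.observe(raw, real)
print(f.adapt(53.0))   # raw forecast 53.0 corrected downward by the average past error
```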

Relevance:

60.00%

Publisher:

Abstract:

This paper proposes a new architecture targeting real-time and reliable Distributed Computer-Controlled Systems (DCCS). This architecture provides a structured approach for the integration of soft and/or hard real-time applications with Commercial Off-The-Shelf (COTS) components. The Timely Computing Base model is used as the reference model to deal with the heterogeneity of system components with respect to guaranteeing the timeliness of applications. The reliability and availability requirements of hard real-time applications are guaranteed by a software-based fault-tolerance approach.