272 results for Graph-Based Metrics
in University of Queensland eSpace - Australia
Abstract:
Models and model transformations are the core concepts of OMG's MDA (TM) approach. Within this approach, most models are derived from the MOF and have a graph-based nature. In contrast, most current model transformations are specified textually. To enable a graphical specification of model transformation rules, this paper proposes to use triple graph grammars as a declarative specification formalism. These triple graph grammars can be specified within the FUJABA tool, and we argue that the resulting rules are easier to specify and become more understandable and maintainable. To show the practicability of our approach, we present how to generate Tefkat rules from triple graph grammar rules, which helps to integrate triple graph grammars with a state-of-the-art model transformation tool and shows the expressiveness of the concept.
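For intuition, the following minimal Python sketch captures the core idea of a forward triple-graph-grammar step: a rule relates a source pattern to a target pattern through an explicit correspondence node that records traceability. All names are illustrative; this is not the FUJABA or Tefkat API.

```python
# Minimal sketch of a forward triple-graph-grammar (TGG) step: a rule
# relates a source pattern (a Class node) to a target pattern (a Table
# node) via an explicit correspondence node. Names are illustrative
# only -- this is not the FUJABA or Tefkat API.

from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                                     # e.g. "Class", "Table", "Corr"
    name: str
    edges: list = field(default_factory=list)     # outgoing (label, Node) pairs

def class_to_table_rule(source_nodes):
    """Forward application of one TGG rule: for every source Class,
    create a Table and a correspondence node linking the two."""
    target, corrs = [], []
    for cls in source_nodes:
        if cls.kind != "Class":
            continue
        table = Node("Table", cls.name.lower() + "s")
        corr = Node("Corr", cls.name)
        corr.edges += [("src", cls), ("trg", table)]   # traceability links
        target.append(table)
        corrs.append(corr)
    return target, corrs

model = [Node("Class", "Order"), Node("Class", "Customer")]
tables, trace = class_to_table_rule(model)
print([t.name for t in tables])   # ['orders', 'customers']
```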
Abstract:
An assessment of the changes in the distribution and extent of mangroves within Moreton Bay, southeast Queensland, Australia, was carried out. Two assessment methods were evaluated: spatial and temporal pattern metrics analysis, and change detection analysis. Currently, about 15,000 ha of mangroves are present in Moreton Bay. These mangroves are important ecosystems, but are subject to disturbance from a number of sources. Over the past 25 years, more than 3800 ha have been lost as a result of natural losses and mangrove clearing (e.g. for urban and industrial development, agriculture and aquaculture). However, areas of new mangroves have become established over the same period, largely offsetting these losses and leaving a net loss of about 200 ha. These new mangroves have mainly appeared in the southern bay region and around the bay islands, particularly on the landward edge of existing mangroves. In addition, the spatial patterns and species composition of mangrove patches have changed. The pattern metrics analysis provided an overview of mangrove distribution and change in the form of single metric values, while the change detection analysis gave a more detailed and spatially explicit description of change. An analysis of the effects of spatial scale on the pattern metrics indicated that they were relatively insensitive to scale at spatial resolutions finer than 50 m, but that most metrics became sensitive at coarser resolutions, a finding which has implications for the mapping of mangroves from remotely sensed data.
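As a hedged illustration of the scale-sensitivity finding (with synthetic data, not the paper's), the sketch below recomputes a simple pattern metric (patch count via connected components) on a raster aggregated to coarser and coarser cells:

```python
# Illustrative sketch (not the paper's data): a boolean "mangrove"
# raster is coarsened by block aggregation, and a simple pattern
# metric (patch count = connected components) is recomputed at each
# scale to show how metrics can drift with resolution.

import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(0)
fine = rng.random((64, 64)) < 0.3          # 1 = mangrove, 0 = other

def coarsen(grid, factor):
    """A coarse cell is mangrove if any fine cell inside it is
    (one of several possible resampling rules)."""
    h, w = grid.shape
    blocks = grid.reshape(h // factor, factor, w // factor, factor)
    return blocks.any(axis=(1, 3))

for factor in (1, 2, 4, 8):                # coarser and coarser cells
    grid = fine if factor == 1 else coarsen(fine, factor)
    n_patches = label(grid)[1]
    print(f"cell size x{factor}: {n_patches} patches")
```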
Abstract:
Formal specifications can precisely and unambiguously define the required behavior of a software system or component. However, formal specifications are complex artifacts that need to be verified to ensure that they are consistent, complete, and validated against the requirements. Specification testing or animation tools exist to assist with this by allowing the specifier to interpret or execute the specification. However, currently little is known about how to do this effectively. This article presents a framework and tool support for the systematic testing of formal, model-based specifications. Several important generic properties that should be satisfied by model-based specifications are first identified. Following the idea of mutation analysis, we then use variants or mutants of the specification to check that these properties are satisfied. The framework also allows the specifier to test application-specific properties. All properties are tested for a range of states that are defined by the tester in the form of a testgraph, which is a directed graph that partially models the states and transitions of the specification being tested. Tool support is provided for the generation of the mutants, for automatically traversing the testgraph and executing the test cases, and for reporting any errors. The framework is demonstrated on a small specification and its application to three larger specifications is discussed. Experience indicates that the framework can be used effectively to test small to medium-sized specifications and that it can reveal a significant number of problems in these specifications.
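The following toy sketch (not the authors' tool or notation) illustrates the two ingredients named above: a testgraph whose edges carry operations is traversed to drive tests, and a mutant of a tiny model-based "specification" is checked against a generic invariant:

```python
# Toy sketch of testgraph-driven mutation testing of a model-based
# "specification" (here just a Python model of a bounded stack).
# The testgraph, property and mutant are illustrative.

class StackSpec:
    def __init__(self, cap=2):
        self.items, self.cap = [], cap
    def push(self, x):
        if len(self.items) < self.cap:
            self.items.append(x)
    def pop(self):
        if self.items:
            self.items.pop()

class MutantStack(StackSpec):           # mutant: off-by-one on the bound
    def push(self, x):
        if len(self.items) <= self.cap:
            self.items.append(x)

def invariant(s):                       # generic property: size within bounds
    return 0 <= len(s.items) <= s.cap

# Testgraph: nodes are abstract states, edges carry operations.
testgraph = {"empty": [("push", "one")],
             "one":   [("push", "full"), ("pop", "empty")],
             "full":  [("push", "full"), ("pop", "one")]}

def run(spec_cls):
    s, node = spec_cls(), "empty"
    for _ in range(6):                  # walk a few edges of the graph
        op, node = testgraph[node][0]
        getattr(s, op)(0) if op == "push" else s.pop()
        if not invariant(s):
            return "invariant violated"
    return "ok"

print(run(StackSpec))    # ok
print(run(MutantStack))  # invariant violated
```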
Abstract:
The basis of this work was to investigate the relative environmental impacts of various power generators, recognising that the plants are located in quite different environments and that different receptors will experience different impacts. Based on the IChemE sustainability metrics paradigm, we calculated potential environmental indicators (P-EI) that represent the environmental burden of the masses of potential pollutants discharged into different receiving media. However, a P-EI may not be significant in practice, since it may not be expressed at all under different conditions; to include some measure of receiver significance, we therefore developed a methodology that takes into account specific environmental indicators (S-EI) referring to the environmental attributes of a specific site. In this context, we acquired site-specific environmental data for the airsheds and water catchment areas in different locations for a limited number of environmental indicators, such as human health (carcinogenic) effects, atmospheric acidification, photochemical (ozone) smog and eutrophication. The S-EI results from this analysis show that atmospheric acidification has the highest impact value, while health risks due to fly ash emissions are not as significant, because many coal power plants in Australia are located in airsheds with low population density. The contributions of coal power plants to photochemical (ozone) smog and eutrophication were not significant. In this study, we have taken emission-related data trends to reflect technology performance (i.e., the P-EI indicators), while a real sustainability metric can be associated only with the specific environmental conditions of the relevant sites (i.e., the S-EI indicators).
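A hedged sketch of how a P-EI of this kind can be computed: in the spirit of the IChemE metrics, an environmental burden is a potency-weighted sum of emitted masses (here, SO2-equivalents for atmospheric acidification). The potency factors below are illustrative placeholders rather than the published values.

```python
# Sketch of a potential environmental indicator (P-EI): an
# environmental burden as a potency-weighted sum of emitted masses.
# The potency factors are illustrative placeholders, not the
# published IChemE values.

acidification_potency = {   # kg SO2-equivalent per kg of pollutant
    "SO2": 1.0,
    "NOx": 0.7,
    "HCl": 0.88,
}

emissions = {"SO2": 1200.0, "NOx": 800.0, "HCl": 15.0}   # kg / year

p_ei = sum(mass * acidification_potency[p] for p, mass in emissions.items())
print(f"atmospheric acidification burden: {p_ei:.0f} kg SO2-eq / year")
```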
Abstract:
A method and a corresponding tool are described which assist design recovery and program understanding by recognising instances of design patterns semi-automatically. The approach taken is specifically designed to overcome the existing scalability problems caused by the many design and implementation variants of design pattern instances. Our approach is based on a new recognition algorithm which works incrementally, rather than trying to analyse a possibly large software system in one pass without any human intervention. The new algorithm exploits domain and context knowledge given by a reverse engineer, together with a special underlying data structure, namely an annotated abstract syntax graph. A comparative and quantitative evaluation of applying the approach to the Java AWT and JGL libraries is also given.
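As a rough illustration only (not the paper's algorithm or data structure), the sketch below applies a cheap structural predicate over a tiny class graph to propose Composite-like candidates, deferring confirmation to the reverse engineer:

```python
# Toy sketch of incremental, semi-automatic pattern detection over a
# small annotated class graph. The structural predicate and the
# confirmation step are illustrative, not the paper's algorithm or
# its annotated abstract syntax graph.

classes = {
    "Component": {"extends": None,        "aggregates": None},
    "Leaf":      {"extends": "Component", "aggregates": None},
    "Composite": {"extends": "Component", "aggregates": "Component"},
}

def composite_candidates(graph):
    """Cheap structural check: a class that both extends and
    aggregates the same supertype looks like a Composite."""
    for name, info in graph.items():
        if info["extends"] and info["extends"] == info["aggregates"]:
            yield name, info["extends"]

def confirm(candidate):
    # Stand-in for the reverse engineer's judgment in the
    # semi-automatic loop; always accepts in this sketch.
    return True

for cls, base in composite_candidates(classes):
    print(f"candidate Composite: {cls} -> {base},",
          "accepted" if confirm((cls, base)) else "rejected")
```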
Abstract:
A general, fast wavelet-based adaptive collocation method is formulated for heat and mass transfer problems involving a steep moving profile of the dependent variable. The technique of grid adaptation is based on sparse point representation (SPR). The method is applied and tested for the case of a gas–solid non-catalytic reaction in a porous solid at high Thiele modulus. Accurate and convergent steep profiles are obtained for Thiele moduli as large as 100 for the case of a slab, and are found to match the analytical solution.
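For reference, the analytical benchmark for a first-order reaction in a slab is the classical profile c(x) = cosh(φx)/cosh(φ) with effectiveness factor η = tanh(φ)/φ; the paper's gas–solid reaction model is more involved, but the sketch below shows how steep this standard solution already is at φ = 100:

```python
# Classical reaction-diffusion benchmark for a slab: steady
# dimensionless profile c(x) = cosh(phi*x)/cosh(phi), effectiveness
# factor eta = tanh(phi)/phi. The paper's gas-solid model is more
# involved; this is only the standard first-order slab solution.

import math

def slab_profile(x, phi):
    """Dimensionless concentration at position x in [0, 1]
    (x = 0 slab centre, x = 1 external surface)."""
    return math.cosh(phi * x) / math.cosh(phi)

phi = 100.0                      # high Thiele modulus, as in the paper
for x in (1.0, 0.95, 0.9, 0.5):
    print(f"c({x:4.2f}) = {slab_profile(x, phi):.3e}")
print(f"effectiveness factor eta = {math.tanh(phi) / phi:.4f}")
```

At φ = 100 the concentration falls by roughly e⁻⁵ within 5% of the slab depth, which is exactly the kind of steep front that motivates adaptive collocation grids.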
Abstract:
This paper presents a new relative measure of signal complexity, referred to here as relative structural complexity, which is based on the matching pursuit (MP) decomposition. By relative, we refer to the fact that this new measure is highly dependent on the decomposition dictionary used by MP. The structural part of the definition points to the fact that this new measure is related to the structure, or composition, of the signal under analysis. After a formal definition, the proposed relative structural complexity measure is used in the analysis of newborn EEG. To do this, firstly, a time-frequency (TF) decomposition dictionary is specifically designed to compactly represent the newborn EEG seizure state using MP. We then show, through the analysis of synthetic and real newborn EEG data, that the relative structural complexity measure can indicate changes in EEG structure as it transitions between the two EEG states; namely seizure and background (non-seizure).
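The sketch below gives a hedged illustration of the idea: a greedy matching pursuit over a toy windowed-sinusoid dictionary, with the number of atoms needed to capture a fixed energy fraction used as a crude complexity proxy. Both the dictionary and the proxy are placeholders, not the authors' TF dictionary or exact measure.

```python
# Illustrative matching-pursuit (MP) sketch: greedily pick dictionary
# atoms, and use the number of atoms needed to capture 95% of the
# signal energy as a crude complexity proxy. Dictionary and proxy are
# placeholders, not the authors' TF dictionary or exact measure.

import numpy as np

rng = np.random.default_rng(1)
n = 256
t = np.arange(n)
# Dictionary: unit-norm windowed sinusoids at a few frequencies/offsets.
atoms = []
for f in (2, 5, 11, 23, 47):
    for shift in (0, 64, 128):
        a = np.zeros(n)
        a[shift:shift + 128] = np.sin(2 * np.pi * f * t[:128] / 128)
        atoms.append(a / np.linalg.norm(a))
D = np.array(atoms)

def mp_complexity(x, D, energy_frac=0.95, max_iter=50):
    residual, e0, k = x.astype(float), np.dot(x, x), 0
    while np.dot(residual, residual) > (1 - energy_frac) * e0 and k < max_iter:
        c = D @ residual                       # correlations with all atoms
        best = np.argmax(np.abs(c))
        residual = residual - c[best] * D[best]
        k += 1
    return k                                   # atoms needed: the proxy

structured = D[3] * 3.0 + D[7] * 1.5           # well matched by dictionary
noise = rng.standard_normal(n)
print("structured signal:", mp_complexity(structured, D))  # small
print("white noise:     ", mp_complexity(noise, D))        # large (hits cap)
```

A signal that the dictionary represents compactly (analogous to the seizure state under a seizure-tuned dictionary) scores low, while a poorly matched signal scores high, which is the sense in which the measure is relative to the dictionary.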
Abstract:
Market-based transmission expansion planning gives investors information on where the most cost-efficient places to invest are, and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are system planners' concerns. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economical losses during operation in a competitive market. It reflects both the investors' and the planner's points of view and further improves on the traditional reliability cost. By applying the EEL, system planners can obtain a clear idea of the transmission network's bottleneck and the amount of loss arising from this weak point. In turn, this enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors undertaking their investments. The index can truly reflect the random behaviour of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), which shows how the EEL can predict the current system bottleneck under future operational conditions and how to use the EEL as one of the planning objectives to determine future optimal plans. Monte Carlo simulation, a well-known simulation method, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
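As a very rough illustration of the Monte Carlo side (the two-line system, probabilities and prices below are invented, not the paper's EEL formulation or the IEEE RTS), an expected-economical-loss style index can be estimated by sampling outages and load and pricing unserved energy:

```python
# Crude Monte Carlo sketch of an expected-economical-loss style index:
# sample random line outages and load levels, price any unserved
# energy, and average. All numbers are illustrative only.

import random

random.seed(0)
LINES = [{"cap": 100.0, "outage_p": 0.02},   # MW, forced-outage rate
         {"cap":  80.0, "outage_p": 0.05}]
PRICE = 50.0                                  # $ per MWh unserved

def sample_loss():
    cap = sum(l["cap"] for l in LINES if random.random() > l["outage_p"])
    load = random.gauss(120.0, 15.0)          # random demand, MW
    unserved = max(0.0, load - cap)
    return unserved * PRICE                   # $ for this sampled hour

N = 100_000
eel = sum(sample_loss() for _ in range(N)) / N
print(f"estimated EEL: ${eel:.2f} per hour")
```

In this toy system the second, smaller line dominates the index: its outages are both more likely and more costly, which is the "bottleneck" information the EEL is meant to surface.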
Abstract:
This paper reports on a system for automated agent negotiation, based on a formal and executable approach to capture the behavior of parties involved in a negotiation. It uses the JADE agent framework, and its major distinctive feature is the use of declarative negotiation strategies. The negotiation strategies are expressed in a declarative rules language, defeasible logic, and are applied using the implemented system DR-DEVICE. The key ideas and the overall system architecture are described, and a particular negotiation case is presented in detail.
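The toy sketch below mimics the spirit of a declarative negotiation strategy in defeasible logic: a default rule can be defeated by a superior exception. It is not DR-DEVICE's rule syntax, just an illustration of defeasible reasoning in plain Python.

```python
# Toy sketch of a defeasible negotiation strategy: a default rule
# ("accept offers above our floor") can be defeated by a superior
# exception ("never accept from a blacklisted party"). Illustrative
# only -- not DR-DEVICE's rule language.

def decide(offer):
    conclusions = []
    # r1 (defeasible): price above floor => accept
    if offer["price"] >= 90:
        conclusions.append(("accept", "r1"))
    # r2 (defeasible, superior to r1): blacklisted party => reject
    if offer["party"] in {"mallory"}:
        conclusions.append(("reject", "r2"))
    superiority = {("r2", "r1")}              # r2 defeats r1
    for verdict, rule in conclusions:
        beaten = any((other, rule) in superiority
                     for _, other in conclusions)
        if not beaten:
            return verdict
    return "undecided"

print(decide({"party": "alice",   "price": 95}))  # accept
print(decide({"party": "mallory", "price": 95}))  # reject
```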
Abstract:
In previous works we showed how to combine propositional multimodal logics using Gabbay's fibring methodology. In this paper we extend the above-mentioned works by providing a tableau-based proof technique for the combined/fibred logics. To achieve this end, we first compare two types of tableau proof systems (graph and path) with the help of a scenario (the Friend's Puzzle). Having done that, we show how to uniformly construct a tableau calculus for the combined logic using Governatori's labelled tableau system KEM. We conclude with a discussion of KEM's features.
Abstract:
We describe a novel method of fabricating atom chips that are well suited to the production and manipulation of atomic Bose–Einstein condensates. Our chip was created using a silver foil and simple micro-cutting techniques, without the need for photolithography. It can sustain larger currents than conventional chips and is compatible with the patterning of complex trapping potentials. A near-pure Bose–Einstein condensate of 4 × 10⁴ ⁸⁷Rb atoms has been created in a magnetic microtrap formed by currents through wires on the chip. We have observed the fragmentation of atom clouds in close proximity to the silver conductors. The fragmentation has different characteristic features from those seen with copper conductors.