954 results for Consistency checking


Relevance:

100.00%

Publisher:

Abstract:

Dissertation for obtaining the degree of Doctor in Informatics Engineering

Relevance:

100.00%

Publisher:

Abstract:

Software architecture erodes over time and needs to be constantly monitored to be kept consistent with its original intended design. Consistency is rarely monitored using automated techniques, since the cost associated with such an activity is typically not considered proportional to its benefits. To improve this situation, we propose Dicto, a uniform DSL for specifying architectural invariants. This language is designed to reduce the cost of consistency checking by offering a framework in which existing validation tools can be matched to newly defined language constructs. In this paper we discuss how such a DSL can be quantitatively and qualitatively evaluated in practice.
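The abstract does not reproduce Dicto's syntax. As a rough illustration of what a single architectural invariant can compile down to, here is a minimal Python sketch that enforces a hypothetical rule of the form "ui cannot depend on db" over a Python codebase; the package names, rule form, and file layout are all invented for illustration, not taken from the paper:

```python
# Minimal sketch of the check a Dicto-style rule such as
# "ui cannot depend on db" could be compiled into. Package names
# and layout are illustrative assumptions.
import ast
from pathlib import Path

FORBIDDEN = {("ui", "db")}  # (source package, target package) pairs

def imported_packages(path: Path) -> set[str]:
    """Top-level packages imported by one Python source file."""
    tree = ast.parse(path.read_text())
    pkgs = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            pkgs.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            pkgs.add(node.module.split(".")[0])
    return pkgs

def check(root: Path) -> list[str]:
    violations = []
    for src, dst in FORBIDDEN:
        for f in (root / src).rglob("*.py"):
            if dst in imported_packages(f):
                violations.append(f"{f}: '{src}' must not depend on '{dst}'")
    return violations

if __name__ == "__main__":
    for v in check(Path(".")):
        print(v)
```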

Relevance:

70.00%

Publisher:

Abstract:

Dissertation for obtaining the degree of Master in Electrical and Computer Engineering

Relevance:

60.00%

Publisher:

Abstract:

A key argument for modeling knowledge in ontologies is the easy re-use and re-engineering of that knowledge. However, besides consistency checking, current ontology engineering tools provide only basic functionality for analyzing ontologies. Since ontologies can be considered as (labeled, directed) graphs, graph analysis techniques are a suitable answer to this need. Graph analysis has been performed by sociologists for over 60 years and has resulted in the vibrant research area of Social Network Analysis (SNA). While social network structures currently receive high attention in the Semantic Web community, there are so far only very few SNA applications, and virtually none for analyzing the structure of ontologies. In this paper we illustrate the benefits of applying SNA to ontologies and the Semantic Web, and discuss which research topics arise at the intersection of the two areas. In particular, we discuss how different notions of centrality describe the core content and structure of an ontology. From the rather simple notion of degree centrality, through betweenness centrality, to the more complex eigenvector centrality based on Hermitian matrices, we illustrate the insights these measures provide on two ontologies that differ in purpose, scope, and size.
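As a concrete illustration of the centrality measures named above, this minimal sketch treats a toy ontology as a directed graph of subclass edges and ranks its classes with networkx; the class hierarchy is invented, and networkx's standard eigenvector centrality stands in for the Hermitian-matrix variant discussed in the paper:

```python
# Sketch: the three centrality notions above, computed on an ontology
# viewed as a directed graph. The toy hierarchy is invented; standard
# eigenvector centrality replaces the paper's Hermitian-matrix variant.
import networkx as nx

# Each edge (A, B) reads "A is a subclass of B".
g = nx.DiGraph([
    ("Dog", "Mammal"), ("Cat", "Mammal"), ("Mammal", "Animal"),
    ("Bird", "Animal"), ("Animal", "LivingThing"), ("Plant", "LivingThing"),
])

for name, scores in [
    ("degree",      nx.degree_centrality(g)),
    ("betweenness", nx.betweenness_centrality(g)),
    ("eigenvector", nx.eigenvector_centrality(g.to_undirected())),
]:
    top = max(scores, key=scores.get)
    print(f"{name:12s} most central: {top} ({scores[top]:.3f})")
```

Even on this toy example the measures disagree in an instructive way: degree favors heavily subclassed hubs, while betweenness favors classes that bridge otherwise separate regions of the hierarchy.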

Relevance:

60.00%

Publisher:

Abstract:

Software architecture plays an essential role in the high-level description of a system design, where structure and communication are emphasized. Despite its importance in the software engineering process, the lack of formal description and automated verification hinders the development of good software architecture models. In this paper, we present an approach to support the rigorous design and verification of software architecture models using Semantic Web technology. We view software architecture models as ontology representations, where their structures and communication constraints are captured by the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). Specific configurations of the design are represented as concrete instances of the ontology, to which their structures and dynamic behaviors must conform. Furthermore, ontology reasoning tools can be applied to perform various automated verification tasks on the design to ensure correctness, such as consistency checking, style recognition, and behavioral inference.
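A minimal sketch of this idea, assuming the owlready2 toolkit (the abstract does not name a specific implementation): an architectural constraint is encoded as a disjointness axiom between OWL classes, and a description-logic reasoner flags a configuration that violates it.

```python
# Sketch of OWL-based architecture checking using owlready2 (an assumed
# toolkit, not named in the paper). Requires `pip install owlready2`
# and a Java runtime for the bundled HermiT reasoner.
from owlready2 import (Thing, ObjectProperty, AllDisjoint, get_ontology,
                       sync_reasoner, OwlReadyInconsistentOntologyError)

onto = get_ontology("http://example.org/arch.owl")  # hypothetical IRI

with onto:
    class Component(Thing): pass
    class Connector(Thing): pass
    AllDisjoint([Component, Connector])  # nothing may be both

    class connects(ObjectProperty):
        domain = [Connector]
        range = [Component]

# A deliberately bad configuration: one individual declared as both
# a Component and a Connector, violating the disjointness axiom.
bad = Component("pipe1")
bad.is_a.append(Connector)

try:
    with onto:
        sync_reasoner()  # run HermiT over the ontology
    print("architecture configuration is consistent")
except OwlReadyInconsistentOntologyError:
    print("inconsistent configuration: pipe1 is both Component and Connector")
```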

Relevance:

30.00%

Publisher:

Abstract:

Common sense tells us that the future is an essential element in any strategy. In addition, there is a good deal of literature on scenario planning, an important tool for considering the future in strategy making. However, in many organizations there is serious resistance to the development of scenarios, and they are not broadly implemented by companies. Yet even organizations that do not rely heavily on scenarios do, in fact, construct visions to guide their strategies. What happens, though, when this vision is not consistent with the future? To address this problem, the present article proposes a method for checking the content and consistency of an organization's vision of the future, no matter how it was conceived. The proposed method is grounded in theoretical concepts from the field of futures studies, which are described in this article. This study was motivated by the search for new ways of improving and using scenario techniques as a method for making strategic decisions. The method was then tested on a company in the field of information technology in order to check its operational feasibility. The test showed that the proposed method is, in fact, operationally feasible and was capable of analyzing the vision of the company being studied, indicating both its shortcomings and its points of inconsistency. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Software erosion can be controlled by periodically checking for consistency between the de facto architecture and its theoretical counterpart. Studies show that this process is often not automated and that developers still rely heavily on manual reviews, despite the availability of a large number of tools. This is partially due to the high cost involved in setting up and maintaining tool-specific and incompatible test specifications that replicate otherwise documented invariants. To reduce this cost, our approach consists of unifying the functionality provided by existing tools under the umbrella of a common business-readable DSL. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by non-technical stakeholders and, at the same time, can be interpreted as a rigorous specification for checking architecture conformance.
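To make the idea of business-readable, tool-agnostic rules concrete, here is a minimal Python sketch of an interpreter for an invented rule grammar; the syntax is a stand-in, not Dicto's actual grammar, and the dependency facts would in practice be produced by an existing analysis backend:

```python
# Sketch of a tiny interpreter for business-readable architecture rules.
# The grammar ("X cannot depend on Y", "X must-only depend on Y") is an
# invented stand-in; `depends` stands in for a static-analysis backend.
import re

RULES = """
GUI cannot depend on Database
Core must-only depend on Utils
"""

depends = {"GUI": {"Core"}, "Core": {"Utils"}, "Database": set()}

def check(rules: str) -> list[str]:
    violations = []
    for line in filter(None, map(str.strip, rules.splitlines())):
        subject, modality, target = re.match(
            r"(\w+) (cannot|must-only) depend on (\w+)", line).groups()
        deps = depends.get(subject, set())
        if modality == "cannot" and target in deps:
            violations.append(line)
        elif modality == "must-only" and deps - {target}:
            violations.append(line)
    return violations

print(check(RULES) or "all architectural rules hold")
```

The point of the design is visible even at this scale: the rule text never mentions a tool, so the same rules could be dispatched to different backends for different languages or artifact types.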

Relevance:

30.00%

Publisher:

Abstract:

Background: The epidemiology of a disease describes the numbers of people becoming incident, being prevalent, recovering, surviving, and dying from the disease or from other causes. As a matter of accounting principle, the inflow, stock, and outflows must be compatible, and if we could observe completely every person involved, the epidemiologic estimates describing the disease would be consistent. Lack of consistency is an indicator of possible measurement error. Methods: We examined the consistency of estimates of incidence, prevalence, and excess mortality of dementia from the Rotterdam Study. We used the incidence and excess mortality estimates to calculate a predicted prevalence with a mathematical disease model, and compared the predicted to the observed prevalence. Results: Predicted prevalence is lower than observed in most age groups, and the difference between them is significant for some age groups. Conclusions: The observed discrepancy could be due to overestimates of prevalence or excess mortality, or an underestimate of incidence, or a combination of all three. We conclude from an analysis of possible causes that it is not possible to say which contributes most to the discrepancy. Estimating dementia incidence in an aging cohort presents a dilemma: with a short follow-up, borderline incident cases are easily missed, while with a longer follow-up, measurement problems increase due to the associated aging of the cohort. Checking for consistency is a useful strategy to signal possible measurement error, but some sources of error may be impossible to avoid.
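A minimal sketch of the consistency check described above: a discrete-time illness-death model turns incidence and mortality rates into a predicted prevalence, which can then be compared with the observed one. All rates below are invented placeholders, not Rotterdam Study estimates:

```python
# Sketch of the consistency check: a discrete-time illness-death model
# predicts prevalence from incidence and mortality. All rates are
# invented placeholders, not Rotterdam Study estimates.
def predicted_prevalence(ages, incidence, mort_healthy, mort_ill):
    healthy, ill = 1.0, 0.0  # cohort fractions at the first age
    prevalence = {}
    for age in ages:
        new_cases = incidence[age] * healthy
        healthy += -new_cases - mort_healthy[age] * healthy
        ill += new_cases - mort_ill[age] * ill
        prevalence[age] = ill / (healthy + ill)  # among survivors
    return prevalence

ages = range(65, 90)
inc = {a: 0.002 * 1.1 ** (a - 65) for a in ages}   # rising incidence
m_h = {a: 0.01 * 1.08 ** (a - 65) for a in ages}   # background mortality
m_i = {a: 2.5 * m_h[a] for a in ages}              # excess mortality when ill

for age, p in predicted_prevalence(ages, inc, m_h, m_i).items():
    if age % 5 == 0:
        print(f"age {age}: predicted prevalence {p:.1%}")
```

If the predicted curve falls systematically below the observed prevalence, as in the study, at least one of the three inputs (incidence, excess mortality, observed prevalence) must carry measurement error.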

Relevance:

30.00%

Publisher:

Abstract:

We discuss how integrity consistency constraints between different UML models can be precisely defined at the language level. In doing so, we introduce a formal object-oriented metamodeling approach, in which integrity consistency constraints between UML models are defined in terms of invariants of the UML model elements used to define the models at the language level. Adopting a formal approach, we define the constraints using Object-Z. We demonstrate how integrity consistency constraints for UML models can be precisely defined at the language level; once completed, the formal description of the consistency constraints serves as a precise reference for checking the consistency of UML models as well as for tool development.
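The paper states its invariants in Object-Z. As a lighter-weight illustration of what one language-level consistency invariant looks like, the following Python sketch checks that every association end refers to a declared class; the metamodel fragment and the invariant are deliberately simplified assumptions, not the paper's formalization:

```python
# Sketch of one language-level consistency invariant, in the spirit of
# the Object-Z approach: association ends must refer only to classes
# the model actually declares. Simplified metamodel fragment.
from dataclasses import dataclass

@dataclass(frozen=True)
class Association:
    name: str
    source: str  # class name at one end
    target: str  # class name at the other end

@dataclass
class ClassModel:
    classes: set[str]
    associations: list[Association]

    def violated_invariants(self) -> list[str]:
        return [
            f"{a.name}: unknown class '{end}'"
            for a in self.associations
            for end in (a.source, a.target)
            if end not in self.classes
        ]

model = ClassModel(
    classes={"Customer", "Order"},
    associations=[Association("places", "Customer", "Order"),
                  Association("ships", "Order", "Warehouse")],  # bad end
)
print(model.violated_invariants())  # ["ships: unknown class 'Warehouse'"]
```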

Relevance:

20.00%

Publisher:

Abstract:

Monte Carlo track structure (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user-defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in the published literature in DNA strand break (SB) yields and selected methodologies. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28 keV-220 keV were superimposed on a geometric DNA model composed of 2.7 × 10^6 nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy E_TH. The SB frequencies and complexities in nucleosomes as a function of incident electron energy were obtained. SBs were classified into higher-order clusters such as single and double strand breaks (SSBs and DSBs) based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy transfer-based SB definitions give SB yields similar to the energy deposit-based one when E_TH ≈ 10.79 eV, but deviate significantly for higher E_TH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes that present a complex damage pattern of more than 2 SBs, and the degree of complexity of the damage in these nucleosomes, diminish as the incident electron energy increases. DNA damage classification into SSBs and DSBs is highly dependent on the definitions of these higher-order structures and on their implementations. The authors show that, for the four studied models, the predicted yields differ by up to 54% for SSBs and by up to 32% for DSBs, depending on the incident electron energy and on the models being compared. MCTS simulations make it possible to compare direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms, such as the DNA model, dose distribution, SB definition, and DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or calculations.
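As an illustration of the clustering step described above, the following sketch groups strand breaks into SSBs and DSBs from their base-pair positions and strands; the 10 bp proximity window is a common convention in the literature, used here as an assumption, and real codes differ in the details the abstract warns about:

```python
# Sketch of SB clustering: breaks within a proximity window form a
# cluster; a cluster touching both strands counts as a DSB, otherwise
# as an SSB. The 10 bp window is a conventional assumption.
def classify_breaks(breaks, window_bp=10):
    """breaks: list of (position_bp, strand) with strand in {1, 2}."""
    breaks = sorted(breaks)
    ssb = dsb = 0
    cluster = []
    for b in breaks + [(float("inf"), 0)]:  # sentinel flushes last cluster
        if cluster and b[0] - cluster[-1][0] > window_bp:
            strands = {s for _, s in cluster}
            dsb += strands == {1, 2}
            ssb += strands != {1, 2}
            cluster = []
        cluster.append(b)
    return ssb, dsb

hits = [(100, 1), (104, 2), (300, 1), (305, 1), (900, 2)]
print(classify_breaks(hits))  # (2, 1): one DSB near bp 100, two SSBs
```

Changing `window_bp` or the two-strand criterion shifts the SSB/DSB split, which is exactly the implementation dependence the authors ask to be reported explicitly.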

Relevance:

20.00%

Publisher:

Abstract:

Quantum field theory with an external background can be considered a consistent model only if the backreaction is relatively small with respect to the background. To find the corresponding consistency restrictions on an external electric field and its duration in QED and QCD, we analyze the mean energy density of the quantized fields for an arbitrary constant electric field E acting during a large but finite time T. Using the corresponding asymptotics with respect to the dimensionless parameter eET^2, one can see that the leading contributions to the energy are due to the creation of particles by the electric field. Assuming that these contributions are small in comparison with the energy density of the electric background, we establish the above-mentioned restrictions, which in fact bound from above the time scales over which an electric field is depleted by backreaction.
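The structure of the restriction can be phrased schematically as follows. This is an illustrative reconstruction in Gaussian units, with all order-one factors (and the exponential suppression inside the pair-creation rate $w$) left implicit; it is not the paper's exact bound:

```latex
% Schematic consistency condition (illustrative reconstruction): the
% energy density of pairs created during the time T must stay far
% below that of the background field itself.
\[
  \varepsilon_{\mathrm{pairs}}(T)
  \;\sim\; \underbrace{w\,T}_{\text{pairs created}}
           \times \underbrace{eET}_{\text{energy per particle}}
  \;\ll\; \varepsilon_{\mathrm{field}} = \frac{E^{2}}{8\pi}
  \quad\Longrightarrow\quad T \ll T_{\mathrm{max}}(E),
\]
% where w is the pair-creation rate per unit volume and eET is the
% typical energy a charge e gains from the field E over the time T.
% Note that the pair energy grows like T^2, through the same
% combination eET^2 that controls the asymptotics in the abstract.
```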

Relevance:

20.00%

Publisher:

Abstract:

The Cluster Variation Method (CVM), introduced over 50 years ago by Prof. Dr. Ryoichi Kikuchi, is applied to the thermodynamic modeling of the BCC Cr-Fe system in the irregular tetrahedron approximation, using experimental thermochemical data as initial input for assessing the model parameters. The results are checked against independent data on the low-temperature miscibility gap, using increasingly accurate thermodynamic models: first by including the magnetic degrees of freedom of iron, and then also those of chromium. It is shown that a reasonably accurate description of the phase diagram on the iron-rich side (i.e., the miscibility gap borders and the Curie line) is obtained, but only at the expense of the agreement with the above-mentioned thermochemical data. Reasons for these inconsistencies are discussed, especially with regard to the need to introduce vibrational degrees of freedom into the CVM model. (C) 2008 Elsevier Ltd. All rights reserved.
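For orientation, the following sketch locates a miscibility gap in the simplest limit of the CVM hierarchy, the regular-solution (point) approximation; the interaction parameter is an invented placeholder, and the paper's tetrahedron approximation and magnetic terms are omitted entirely:

```python
# Sketch: spinodal bounds of a miscibility gap from a regular-solution
# free energy, the point-approximation limit of the CVM hierarchy.
# OMEGA is an invented placeholder, not a fitted Cr-Fe parameter.
import numpy as np

R = 8.314          # J/(mol K)
OMEGA = 20_000.0   # J/mol; positive favors demixing (assumed value)

def spinodal_bounds(T, n=100_001):
    """Compositions where d2G/dx2 = -2*OMEGA + R*T/(x*(1-x)) vanishes."""
    x = np.linspace(1e-6, 1 - 1e-6, n)
    d2g = -2 * OMEGA + R * T / (x * (1 - x))
    return x[np.where(np.diff(np.sign(d2g)) != 0)[0]]

for T in (800, 1100, 1300):
    b = spinodal_bounds(T)
    print(f"T = {T} K:",
          f"spinodal at x = {np.round(b, 3)}" if b.size
          else f"single phase (T_c = {OMEGA / (2 * R):.0f} K)")
```

The CVM's contribution, relative to this toy limit, is a far better entropy estimate from short-range correlations on tetrahedron clusters, which is what the paper tests against the thermochemical and miscibility-gap data.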