965 results for formal methods


Relevance:

60.00%

Publisher:

Abstract:

Conceptual combination plays a fundamental role in creating the broad range of compound phrases utilised in everyday language. While the systematicity and productivity of language provide a strong argument in favour of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered a function of the semantics of the constituent concepts, or not. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. Compositionality is equated with a joint probability distribution modelling how the constituent concepts in the combination are interpreted. Marginal selectivity is emphasised as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities (referred to collectively as Bell-type). Non-compositionality is then equated with either a failure of marginal selectivity, or, in the presence of marginal selectivity, with a violation of Bell-type inequalities. In both non-compositional scenarios, the conceptual combination cannot be modelled using a joint probability distribution with variables corresponding to the interpretation of the individual concepts. The framework is demonstrated by applying it to an empirical scenario of twenty-four non-lexicalised conceptual combinations.
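The marginal-selectivity and Bell-type checks can be made concrete with a small numerical sketch; the distributions and variable names below are illustrative, not data from the study:

```python
from itertools import product

# Joint distributions for measurement contexts, each over outcomes
# (a, b) with a, b in {+1, -1}. Numbers here are hypothetical.
def correlation(p):
    """E[A * B] under the joint distribution p[(a, b)]."""
    return sum(a * b * p[(a, b)] for a, b in product((+1, -1), repeat=2))

def marginal_a(p):
    """P(A = +1) under the joint distribution p."""
    return p[(+1, +1)] + p[(+1, -1)]

def chsh(p11, p12, p21, p22):
    """CHSH statistic: a single joint distribution over all four
    variables (compositional semantics) forces |S| <= 2."""
    return abs(correlation(p11) + correlation(p12)
               + correlation(p21) - correlation(p22))

corr = {(+1, +1): 0.5, (-1, -1): 0.5, (+1, -1): 0.0, (-1, +1): 0.0}
anti = {(+1, -1): 0.5, (-1, +1): 0.5, (+1, +1): 0.0, (-1, -1): 0.0}

# Marginal selectivity: P(A = +1) must not depend on the paired context.
assert marginal_a(corr) == marginal_a(anti)

S = chsh(corr, corr, corr, anti)
print(S)  # 4.0 > 2: these contexts admit no joint distribution
```

Here marginal selectivity holds, yet the CHSH bound is violated, so under the framework above the combination would be classed as non-compositional.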

Relevance:

60.00%

Publisher:

Abstract:

Service researchers have repeatedly claimed that firms should acquire customer information in order to develop services that fit customer needs. Despite this, studies concentrating on the actual use of customer information in service development are lacking. The present study fills this research gap by investigating information use during a service development process. It demonstrates that use is not a straightforward task that automatically follows the acquisition of customer information. In fact, of the six identified types of use, four represent non-usage of customer information. Hence, the study demonstrates that acquiring customer information does not guarantee that the information will actually be used in development. The study took an ethnographic approach and was consequently conducted in the field, in real time, over an extensive period of 13 months. Participant observation allowed direct access to the investigated phenomenon: the different types of use could be captured as they emerged among the observed development project members. In addition, interviews, informal discussions and internal documents were used to gather data. The development process of a bank's website constituted the empirical context of the investigation. This ethnography brings novel insights to both academia and practice. It critically questions the traditional focus on the firm's acquisition of customer information and suggests that this focus ought to be expanded to the actual use of customer information. What is the point in acquiring costly customer information if it is not used in development? Based on the findings of this study, a holistic view on customer information, "information in use", is generated. This view extends the traditional view of customer information in three ways: the source, timing and form of data collection.
First, the study showed that customer information can come explicitly from the customer, from speculation among the developers, or it can already exist implicitly. Prior research has mainly focused on the customer as the information provider and the explicit source to turn to for information. Second, the study identified that the used and non-used customer information was acquired both previously and currently, within the time frame of the focal development process, as well as potentially in the future. Prior research has primarily focused on currently acquired customer information, i.e. information acquired within the timeframe of the development process. Third, the used and non-used customer information was both formally and informally acquired. In prior research, a large number of sophisticated formal methods have been suggested for acquiring the customer information to be used in development. By focusing on "information in use", new knowledge on the types of customer information that are actually used was generated. For example, the findings show that formal customer information acquired during the development process is used less than customer information already existing within the firm. With this knowledge at hand, better methods to capture this more usable customer information can be developed. Moreover, the thesis suggests that by focusing more strongly on the use of customer information, service development processes can be restructured to accommodate the information that is actually used.

Relevance:

60.00%

Publisher:

Abstract:

Conformance testing focuses on checking whether an implementation under test (IUT) behaves according to its specification. Typically, testers are interested in performing targeted tests that exercise certain features of the IUT. This intention is formalized as a test purpose. The tester needs a "strategy" to reach the goal specified by the test purpose. Also, for a particular test case, the strategy should tell the tester whether the IUT has passed, failed, or deviated from the test purpose. In [8], Jeron and Morel show how to compute, for a given finite state machine specification and a test purpose automaton, a complete test graph (CTG) which represents all test strategies. In this paper, we consider the case when the specification is a hierarchical state machine and show how to compute a hierarchical CTG which preserves the hierarchical structure of the specification. We also propose an algorithm for an online test oracle which avoids the space overhead associated with the CTG.
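At the core of computing test strategies is a synchronous product of the specification machine with the test-purpose automaton. Below is a minimal flat-FSM sketch of that product; the state and label names are hypothetical, and this is not the paper's hierarchical construction:

```python
def product(spec, purpose, init_s, init_p, accept_p):
    """Synchronous product of a specification FSM and a test-purpose
    automaton. `spec` and `purpose` map (state, label) -> next state.
    Product states whose purpose component is accepting mark where the
    test purpose has been reached."""
    states, trans = set(), {}
    frontier = [(init_s, init_p)]
    while frontier:
        s, p = frontier.pop()
        if (s, p) in states:
            continue
        states.add((s, p))
        for (s2, label), s_next in spec.items():
            if s2 != s:
                continue
            # the purpose stays put on labels it does not observe
            p_next = purpose.get((p, label), p)
            trans[((s, p), label)] = (s_next, p_next)
            frontier.append((s_next, p_next))
    accepting = {q for q in states if q[1] in accept_p}
    return states, trans, accepting

# Illustrative spec: a two-state loop; purpose: observe one 'b'.
spec = {("s0", "a"): "s1", ("s1", "b"): "s0"}
purpose = {("p0", "b"): "p1"}
states, trans, accepting = product(spec, purpose, "s0", "p0", {"p1"})
print(sorted(states))
print(sorted(accepting))
```

A complete test graph is then derived from such a product by pruning states from which the accepting set is unreachable.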

Relevance:

60.00%

Publisher:

Abstract:

Memory models of shared memory concurrent programs define the values a read of a shared memory location is allowed to see. Such memory models are typically weaker than the intuitive sequential consistency semantics in order to allow efficient execution. In this paper, we present WOMM (an abbreviation for Weak Operational Memory Model) which formally unifies two sources of weak behavior in hardware memory models: reordering of instructions and weakly consistent memory. We show that a large number of optimizations are allowed by WOMM. We also show that WOMM is weaker than a number of hardware memory models. Consequently, if a program behaves correctly under WOMM, it will be correct with respect to those hardware memory models. Hence, WOMM can be used as a formally specified abstraction of the hardware memory models. Moreover, unlike most weak memory models, WOMM is described using operational semantics, making it easy to integrate into a model checker for concurrent programs. We further show that WOMM has an important property: it has sequential consistency semantics for data-race-free programs.
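The classic store-buffering litmus test illustrates the kind of weak behaviour a model like WOMM must admit; the enumeration below is a generic illustration of reordering effects, not the WOMM formalism itself:

```python
# Store-buffering litmus test.
# Thread 1: x = 1; r1 = y        Thread 2: y = 1; r2 = x
T1 = [("store", "x"), ("load", "y", "r1")]
T2 = [("store", "y"), ("load", "x", "r2")]

def merges(a, b):
    """All interleavings of a and b that preserve each list's order."""
    if not a:
        yield list(b); return
    if not b:
        yield list(a); return
    for rest in merges(a[1:], b):
        yield [a[0]] + rest
    for rest in merges(a, b[1:]):
        yield [b[0]] + rest

def run(seq):
    """Execute one interleaving over a zero-initialized memory."""
    mem, regs = {"x": 0, "y": 0}, {}
    for op in seq:
        if op[0] == "store":
            mem[op[1]] = 1
        else:
            regs[op[2]] = mem[op[1]]
    return (regs["r1"], regs["r2"])

sc = {run(s) for s in merges(T1, T2)}   # sequentially consistent runs

# A store buffer effectively lets each thread's independent load be
# reordered before its store; add those per-thread reorderings too.
weak = set(sc)
for t1 in (T1, T1[::-1]):
    for t2 in (T2, T2[::-1]):
        weak |= {run(s) for s in merges(t1, t2)}

print((0, 0) in sc, (0, 0) in weak)  # False True
```

Sequential consistency forbids the (r1, r2) = (0, 0) outcome, while a model allowing load-before-store reordering permits it; since this program has data races, the DRF guarantee stated above does not apply.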

Relevance:

60.00%

Publisher:

Abstract:

We give a detailed construction of a finite-state transition system for a com-connected Message Sequence Graph (MSG). Though this result is well known in the literature and forms the basis for the solution to several analysis and verification problems concerning MSG specifications, the constructions given in the literature are either not amenable to implementation, or imprecise, or simply incorrect. In contrast, we give a detailed construction along with a proof of its correctness. Our transition system is amenable to implementation, and can also be used for a bounded analysis of general (not necessarily com-connected) MSG specifications.

Relevance:

60.00%

Publisher:

Abstract:

Denial-of-service (DoS) attacks form a very important category of security threats that are prevalent in MIPv6 (mobile internet protocol version 6) today. Many schemes have been proposed to alleviate such threats, including one of our own [9]. However, reasoning about the correctness of such protocols is not trivial. In addition, new solutions to mitigate attacks may need to be deployed in the network on a frequent basis, as and when attacks are detected, since it is practically impossible to anticipate all attacks and provide solutions in advance. This makes it necessary to validate the solutions in a timely manner before deployment in the real network. However, the threshold schemes needed in group protocols make analysis complex. Model checking of threshold-based group protocols that employ cryptography has not been successful so far. Here, we propose a new simulation-based approach for validation using a tool called FRAMOGR that supports executable specification of group protocols that use cryptography. FRAMOGR allows one to specify attackers and track probability distributions of values or paths. We believe that infrastructure such as FRAMOGR will be required in future for validating new group-based threshold protocols that may be needed for making MIPv6 more robust.

Relevance:

60.00%

Publisher:

Abstract:

We propose a new abstract domain for static analysis of executable code. Concrete states are abstracted using circular linear progressions (CLPs). CLPs model computations using a finite word length, as is seen in any real-life processor. The finite abstraction allows handling overflow scenarios in a natural and straightforward manner. Abstract transfer functions have been defined for a wide range of operations, which makes this domain easily applicable for analyzing code for a wide range of ISAs. CLPs combine the scalability of interval domains with the discreteness of linear congruence domains. We also present a novel, lightweight method to track linear equality relations between static objects that is used by the analysis to improve precision. The analysis is efficient, the total space and time overhead being quadratic in the number of static objects being tracked.
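A minimal sketch of the wraparound behaviour that makes CLPs natural for finite word lengths; the stride-equal addition below is an illustrative transfer function, not the paper's exact definition:

```python
W = 8          # hypothetical word length in bits
MOD = 1 << W

class CLP:
    """Circular linear progression: {base + i*step (mod 2^W) | 0 <= i <= count}."""
    def __init__(self, base, step, count):
        self.base, self.step, self.count = base % MOD, step % MOD, count

    def concretize(self):
        return {(self.base + i * self.step) % MOD
                for i in range(self.count + 1)}

    def add(self, other):
        """Sound abstract addition for equal strides: sums of the two
        progressions again form a progression with that stride."""
        assert self.step == other.step
        return CLP(self.base + other.base, self.step,
                   self.count + other.count)

a = CLP(250, 2, 5)             # wraps past 255: {250, 252, 254, 0, 2, 4}
b = a.add(CLP(10, 2, 0))       # add the singleton {10}
print(sorted(b.concretize()))  # overflow is handled by the modulus itself

# Soundness check: every concrete sum is covered by the abstract result.
sums = {(x + y) % MOD for x in a.concretize() for y in {10}}
assert sums <= b.concretize()
```

An ordinary interval domain would have to widen to the full range at the 255 to 0 wraparound; the circular representation keeps the set of six values exact.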

Relevance:

60.00%

Publisher:

Abstract:

Continuous advances in VLSI technology have made the implementation of very complicated systems possible. Modern Systems-on-Chip (SoCs) have many processors, IP cores and other functional units. As a result, complete verification of whole systems before implementation is becoming infeasible; hence it is likely that these systems may have some errors after manufacturing. This increases the need to find design errors in chips after fabrication. The main challenge for post-silicon debug is the observability of internal signals. Post-silicon debug is the problem of determining what is wrong when the fabricated chip of a new design behaves incorrectly. This problem now consumes over half of the overall verification effort on large designs, and it is growing worse. Traditional post-silicon debug methods concentrate on the functional parts of systems and provide mechanisms to increase the observability of the internal state of systems. Those methods may not be sufficient, as modern SoCs have many blocks (processors, IP cores, etc.) which communicate with one another, and communication is another source of design errors. This tutorial will provide an insight into various observability enhancement techniques, on-chip instrumentation techniques, and the use of high-level models to support the debug process, targeting both the insides of blocks and the communication among them. It will also cover the use of formal methods to aid the debug process.

Relevance:

60.00%

Publisher:

Abstract:

One of the challenges in accurately estimating the Worst Case Execution Time (WCET) of executables is to accurately predict their cache behaviour. Various techniques have been developed to predict the cache contents at different program points in order to estimate the execution time of memory-accessing instructions. One of the most widely used techniques is Abstract Interpretation based Must Analysis, which determines the cache blocks guaranteed to be present in the cache, and hence provides a safe estimation of cache hits and misses. However, Must Analysis is highly imprecise, and platforms using Must Analysis have been known to produce blown-up WCET estimates. In our work, we propose to use May Analysis to assist the Must Analysis cache update and make it more precise. We prove the safety of our approach as well as provide examples where our Improved Must Analysis provides better precision. Further, we also detect a serious flaw in the original Persistence Analysis, and use Must and May Analysis to assist the Persistence Analysis cache update, to make it safe and more precise than the known solutions to the problem.
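As an illustration of the must-analysis machinery discussed above, here is a minimal sketch of the abstract join and update for one LRU cache set; block names and the associativity are hypothetical, and this is the classical formulation rather than the paper's improved one:

```python
def must_join(s1, s2):
    """Join of two abstract must-cache states for one LRU set.
    Each state maps block -> upper bound on its age (0 = most recent).
    A block survives only if it is in both states, at the maximum
    (pessimistic, hence safe) age bound."""
    return {b: max(s1[b], s2[b]) for b in s1.keys() & s2.keys()}

def must_update(state, block, ways=4):
    """Access `block`: it becomes age 0; blocks younger than it age by
    one and are evicted from the abstract state once age >= ways."""
    new = {}
    age_b = state.get(block, ways)   # ways == "not known to be cached"
    for b, a in state.items():
        if b == block:
            continue
        new_age = a + 1 if a < age_b else a   # only younger blocks age
        if new_age < ways:
            new[b] = new_age
    new[block] = 0
    return new

s1 = {"A": 0, "B": 2}
s2 = {"A": 1, "C": 0}
print(must_join(s1, s2))      # only A is guaranteed cached on both paths
print(must_update(s1, "C"))   # C at age 0; A and B each age by one
```

The imprecision the paper targets is visible in the join: B and C are dropped entirely even though each is cached on one of the two incoming paths.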

Relevance:

60.00%

Publisher:

Abstract:

Counter systems are a well-known and powerful modeling notation for specifying infinite-state systems. In this paper we target the problem of checking liveness properties in counter systems. We propose two semi-decision techniques towards this, both of which return a formula that encodes the set of reachable states of the system that satisfy a given liveness property. A novel aspect of our techniques is that they use reachability analysis techniques, which are well studied in the literature, as black boxes, and are hence able to compute precise answers on a much wider class of systems than previous approaches to the same problem. Secondly, they compute their results by iterative expansion or contraction, and hence permit an approximate solution to be obtained at any point. We state the formal properties of our techniques, and also provide experimental results using standard benchmarks to show the usefulness of our approaches. Finally, we sketch an extension of our liveness checking approach to check general CTL properties.
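The iterative-contraction idea can be illustrated on a finite-state system, where the analogous liveness computation is the greatest fixpoint for EG p; the sketch below is that finite-state analogue, not the paper's infinite-state technique:

```python
def eg(states, succ, p):
    """EG p as a greatest fixpoint: start from all p-states and
    repeatedly discard states with no successor still in the set."""
    z = {s for s in states if p(s)}
    while True:
        z2 = {s for s in z if succ(s) & z}
        if z2 == z:
            return z
        z = z2

# Illustrative 4-state system: 0 -> {1, 3}, 1 -> {2}, 2 -> {1}, 3 -> {3}
edges = {0: {1, 3}, 1: {2}, 2: {1}, 3: {3}}
result = eg({0, 1, 2, 3}, lambda s: edges[s], lambda s: s != 3)
print(sorted(result))  # [0, 1, 2]: states with a path avoiding 3 forever
```

Each iteration of the loop is one contraction step, so stopping early yields an over-approximation, mirroring the anytime behaviour described above.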

Relevance:

60.00%

Publisher:

Abstract:

Atlantic menhaden, Brevoortia tyrannus, the object of a major purse-seine fishery along the U.S. east coast, are landed at plants from northern Florida to central Maine. The National Marine Fisheries Service has sampled these landings since 1955 for length, weight, and age. Together with records of landings at each plant, the samples are used to estimate the numbers of fish landed at each age. This report analyzes the sampling design in terms of probability sampling theory. The design is classified as two-stage cluster sampling, the first stage consisting of purse-seine sets randomly selected from the population of all sets landed, and the second stage consisting of fish randomly selected from each sampled set. Implicit assumptions of this design are discussed with special attention to current sampling procedures. Methods are developed for estimating mean fish weight, numbers of fish landed, and age composition of the catch, with approximate 95% confidence intervals. Based on specific results from three ports (Port Monmouth, N.J., Reedville, Va., and Beaufort, N.C.) for the 1979 fishing season, recommendations are made for improving sampling procedures to comply more exactly with the assumptions of the sampling design. These recommendations include adopting more formal methods for randomizing set and fish selection, increasing the number of sets sampled, considering the bias introduced by unequal set sizes, and developing methods to optimize the use of funds and personnel. (PDF file contains 22 pages.)
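The two-stage estimate of mean fish weight with an approximate 95% confidence interval can be sketched as follows; the weights are made up, and the calculation treats the sampled sets' means as the primary observations:

```python
from math import sqrt

def estimate_mean_weight(sets):
    """Estimate mean fish weight from two-stage cluster sampling:
    `sets` is a list of per-set samples of fish weights. Set (cluster)
    means are averaged, and the between-set variance drives the
    approximate 95% confidence interval."""
    n = len(sets)
    set_means = [sum(w) / len(w) for w in sets]
    mean = sum(set_means) / n
    var_between = sum((m - mean) ** 2 for m in set_means) / (n - 1)
    se = sqrt(var_between / n)
    return mean, (mean - 1.96 * se, mean + 1.96 * se)

# Three sampled purse-seine sets (illustrative weights in grams):
sets = [[210, 230, 250], [190, 200, 210], [240, 260, 220]]
mean, ci = estimate_mean_weight(sets)
print(round(mean, 1), [round(x, 1) for x in ci])
```

Note that this simple estimator weights all sets equally, so it ignores unequal set sizes, one of the bias sources the report recommends considering.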

Relevance:

60.00%

Publisher:

Abstract:

The problem of s-d exchange scattering of conduction electrons off localized magnetic moments in dilute magnetic alloys is considered, employing formal methods of quantum field theoretical scattering. It is shown that such a treatment not only allows, for the first time, the inclusion of multiparticle intermediate states in single-particle scattering equations, but also results in extremely simple and straightforward mathematical analysis. These equations are proved to be exact in the thermodynamic limit. A self-consistent integral equation for the electron self-energy is derived and approximately solved. The ground state and physical parameters of dilute magnetic alloys are discussed in terms of the theoretical results. Within the approximation of single-particle intermediate states our results reduce to earlier versions. The following additional features are found as a consequence of the inclusion of multiparticle intermediate states:

(i) A non-analytic binding energy is present for both antiferromagnetic (J < 0) and ferromagnetic (J > 0) couplings of the electron-plus-impurity system.

(ii) The correct behavior of the energy difference between the conduction-electron-plus-impurity system and the free-electron system is found, free of the unphysical singularities present in earlier versions of the theory.

(iii) The ground state of the conduction-electron-plus-impurity system is shown to be a many-body condensate state for both J < 0 and J > 0. However, a distinction is made between the usual terminology of "singlet" and "triplet" ground states and the nature of our ground state.

(iv) It is shown that long-range ordering of the magnetic moments can result from a contact interaction such as the s-d exchange interaction.

(v) The explicit temperature dependence of the excess specific heat of Kondo systems is obtained and found to be linear in temperature as T → 0 and to go as T ln T for 0.3 T_K ≤ T ≤ 0.6 T_K. A rise in (ΔC/T) for temperatures in the region 0 < T ≤ 0.1 T_K is predicted. These results are found to be in excellent agreement with experiment.

(vi) The existence of a critical temperature for ferromagnetic coupling (J > 0) is shown. On this basis, the apparent contradiction of the simultaneous existence of giant moments and the Kondo effect is resolved.
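For reference, the s-d exchange interaction underlying these results is conventionally written in its standard textbook form; this is general background, not an equation reproduced from the work itself:

```latex
H \;=\; \sum_{k\sigma} \epsilon_k\, c^{\dagger}_{k\sigma} c_{k\sigma}
\;-\; \frac{J}{N} \sum_{k,k'} \sum_{\alpha\beta}
\mathbf{S} \cdot \boldsymbol{\sigma}_{\alpha\beta}\,
c^{\dagger}_{k\alpha} c_{k'\beta}
```

With this sign convention, J < 0 is the antiferromagnetic case, matching the couplings referred to in (i) through (vi) above.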

Relevance:

60.00%

Publisher:

Abstract:

Published and unpublished research findings regarding charter and headboat fishing customers from 11 studies were reviewed to provide a marketing data base for operators and to guide further research efforts. Generally, charter/headboat fishing is a male-oriented activity. Customers were between 30 and 55 years of age. Although both groups of anglers considered themselves to be experienced, charterboat anglers had fished for more years. Charter anglers fished more often with their families and headboat anglers more often with their friends. Charterboat anglers reported higher incomes than headboat anglers. Relaxation, having fun, and escaping from daily pressures were generally more important to both groups of anglers than motives relative to catching fish. Most anglers indicated that the skills and performance of the captain and crew contributed heavily to the overall evaluation of their fishing experience. Anglers were more heavily influenced to choose a particular captain or boat by informal advertising methods (i.e., word-of-mouth recommendations, reputation, and visits to the marina) than formal methods (i.e., advertisements, brochures, radio, and television). Charter anglers relied more on word-of-mouth recommendations and headboat customers were more influenced by previous experiences. Implications for further research are discussed.

Relevance:

60.00%

Publisher:

Abstract:

Specifications play an important role in software development and verification. A specification based on first-order logic can be executed and tested using finite-model construction techniques. This paper studies the handling of certain features of such specifications, including existential quantifiers and the transitive closure of binary relations. An existing model-construction tool is extended accordingly, and several errors in the literature are uncovered.
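Both features named above are directly computable over a finite domain, which is what makes execution by finite-model construction possible; a minimal sketch with an illustrative relation, not an example from the paper:

```python
def transitive_closure(domain, rel):
    """Warshall-style closure: the smallest transitive relation
    containing rel over the given finite domain."""
    closure = set(rel)
    for k in domain:
        for i in domain:
            for j in domain:
                if (i, k) in closure and (k, j) in closure:
                    closure.add((i, j))
    return closure

def exists(domain, pred):
    """An existential quantifier over a finite domain is just a
    finite disjunction."""
    return any(pred(d) for d in domain)

domain = [0, 1, 2]
rel = {(0, 1), (1, 2)}
tc = transitive_closure(domain, rel)
print(sorted(tc))                              # [(0, 1), (0, 2), (1, 2)]
print(exists(domain, lambda x: (0, x) in tc))  # True
```

A model-construction tool evaluates quantifiers and closures in exactly this brute-force manner, which is feasible precisely because the candidate models are finite.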