959 results for formal model
Abstract:
This work presents a design method for building software components, in a rigorous fashion, from the functional software model down to the assembly-code level. The method is based on the B method, which was developed with the support and interest of British Petroleum (BP). One goal of this methodology is to contribute to solving an important problem known as the Verifying Compiler. In addition, this work describes a formal model of the Z80 microcontroller and of a real system from the petroleum domain. To this end, the formal model of the Z80 was developed and documented, as it is a key component for verification down to the assembly level. To refine the methodology, it was applied to a petroleum production test system, which is presented in this work. Part of the technique is performed manually; however, most of these activities can be automated by a specific compiler. To build such a compiler, the formal modelling of the microcontroller and of the production test system should provide relevant knowledge and experience for the design of a new compiler. In summary, this work should improve the viability of one of the most stringent criteria for formal verification: speeding up the verification process, reducing design time, and increasing the quality and reliability of the final software product. All these qualities are very important for systems that involve serious risks or require high confidence, which is very common in the petroleum industry.
Abstract:
The development of a formal model of strategy and game dynamics constitutes an original scientific contribution in the context of invasion team sports. The constructive procedure for a strategy defined in the model consists of five sets of elements ordered in hierarchical levels, which support the design of strategies capable of properly guiding a team in a game. Once the strategy model is defined, the formalization of the game dynamics allows its contextualization at the moment of its application, guiding the players during the match. The structure of the game was decomposed and its fundamental dynamic properties were defined. In this way, the modelling of the dynamics of opposition and of strategy complement each other, since together they define the moments at which strategic information is used by the players. The formal structure presented opens a line of research that may help to limit subjectivity in defining the analysis criteria of future experimental designs, leading to more accurate interpretations and comparisons of study results.
Abstract:
The doctrine of fair use allows limited copying of creative works based on the rationale that copyright holders would consent to such uses if bargaining were possible. This paper develops a formal model of fair use in an effort to derive the efficient legal standard for applying the doctrine. The model interprets copies and originals as differentiated products and defines fair use as a threshold separating permissible copying from infringement. The analysis highlights the role of technology in shaping the efficient standard. Discussion of several key cases illustrates the applicability of the model.
Abstract:
This research presents a quality-assurance model for Alternative education in virtual mode for Indigenous Peoples of the department of La Paz, Bolivia. A theoretical model is proposed, consisting of components that emerge from the stated problem, complemented by a comparative analysis of quality models in virtual education and the selection of variables and indicators. A description of the initial explanatory causal model is also given, all using elements suited to the characteristics of the Indigenous Peoples of the department of La Paz. Subsequently, the experience of ICT training with two indigenous populations applying the proposed model is detailed, which allowed an empirical validation of the model. The results of the quality surveys arising from the application of the model, and their corresponding completion, are also reported. From these data, the pertinent statistical analyses were performed for a formal validation of the model, structuring a database with which the model is validated through confirmatory analysis, which verifies the fit of the sample data to the proposed model.
Abstract:
Formal specifications can precisely and unambiguously define the required behavior of a software system or component. However, formal specifications are complex artifacts that need to be verified to ensure that they are consistent, complete, and validated against the requirements. Specification testing or animation tools exist to assist with this by allowing the specifier to interpret or execute the specification. However, currently little is known about how to do this effectively. This article presents a framework and tool support for the systematic testing of formal, model-based specifications. Several important generic properties that should be satisfied by model-based specifications are first identified. Following the idea of mutation analysis, we then use variants or mutants of the specification to check that these properties are satisfied. The framework also allows the specifier to test application-specific properties. All properties are tested for a range of states that are defined by the tester in the form of a testgraph, which is a directed graph that partially models the states and transitions of the specification being tested. Tool support is provided for the generation of the mutants, for automatically traversing the testgraph and executing the test cases, and for reporting any errors. The framework is demonstrated on a small specification and its application to three larger specifications is discussed. Experience indicates that the framework can be used effectively to test small to medium-sized specifications and that it can reveal a significant number of problems in these specifications.
Abstract:
This paper examines the economic significance of return predictability in Australian equities. In light of considerable model uncertainty, formal model-selection criteria are used to choose a specification for the predictive model. A portfolio-switching strategy is implemented according to model predictions. Relative to a buy-and-hold market investment, the returns to the portfolio-switching strategy are impressive under several model-selection criteria, even after accounting for transaction costs. However, as these findings are not robust across other model-selection criteria examined, it is difficult to conclude that the degree of return predictability is economically significant.
Abstract:
A formal model of natural language processing in knowledge-based information systems is considered, and the components realizing the functions of the proposed formal model are described.
Abstract:
The purpose of this research was to apply model checking by using a symbolic model checker on Predicate Transition Nets (PrT Nets). A PrT Net is a formal model of information flow which allows system properties to be modeled and analyzed. The aim of this thesis was to use the modeling and analysis power of PrT nets to provide a mechanism for the system model to be verified. Symbolic Model Verifier (SMV) was the model checker chosen in this thesis, and in order to verify the PrT net model of a system, it was translated to SMV input language. A software tool was implemented which translates the PrT Net into SMV language, hence enabling the process of model checking. The system includes two parts: the PrT net editor where the representation of a system can be edited, and the translator which converts the PrT net into an SMV program.
Abstract:
Trials in a temporal two-interval forced-choice discrimination experiment consist of two sequential intervals presenting stimuli that differ from one another as to magnitude along some continuum. The observer must report in which interval the stimulus had the larger magnitude. The standard difference model from signal detection theory analyses posits that order of presentation should not affect the results of the comparison, something known as the balance condition (J.-C. Falmagne, 1985, in Elements of Psychophysical Theory). But empirical data prove otherwise and consistently reveal what Fechner (1860/1966, in Elements of Psychophysics) called time-order errors, whereby the magnitude of the stimulus presented in one of the intervals is systematically underestimated relative to the other. Here we discuss sensory factors (temporary desensitization) and procedural glitches (short interstimulus or intertrial intervals and response bias) that might explain the time-order error, and we derive a formal model indicating how these factors make observed performance vary with presentation order despite a single underlying mechanism. Experimental results are also presented illustrating the conventional failure of the balance condition and testing the hypothesis that time-order errors result from contamination by the factors included in the model.
Abstract:
We introduce a formal model for certificateless authenticated key exchange (CL-AKE) protocols. Contrary to what might be expected, we show that the natural combination of an ID-based AKE protocol with a public-key-based AKE protocol cannot provide strong security. We provide the first one-round CL-AKE scheme proven secure in the random oracle model. We introduce two variants of the Diffie-Hellman trapdoor test introduced by \cite{DBLP:conf/eurocrypt/CashKS08}. The proposed key agreement scheme is secure as long as each party has at least one uncompromised secret. Thus, our scheme is secure even if the key generation centre learns the ephemeral secrets of both parties.
Abstract:
The human-technology nexus is a strong focus of Information Systems (IS) research; however, very few studies have explored this phenomenon in anaesthesia. Anaesthesia has a long history of adoption of technological artifacts, ranging from early apparatus to present-day information systems such as electronic monitoring and pulse oximetry. This prevalence of technology in modern anaesthesia and the rich human-technology relationship provides a fertile empirical setting for IS research. This study employed a grounded theory approach that began with a broad initial guiding question and, through simultaneous data collection and analysis, uncovered a core category of technology appropriation. This emergent basic social process captures a central activity of anaesthetists and is supported by three major concepts: knowledge-directed medicine, complementary artifacts and culture of anaesthesia. The outcomes of this study are: (1) a substantive theory that integrates the aforementioned concepts and pertains to the research setting of anaesthesia and (2) a formal theory, which further develops the core category of appropriation from anaesthesia-specific to a broader, more general perspective. These outcomes fulfill the objective of a grounded theory study, being the formation of theory that describes and explains observed patterns in the empirical field. In generalizing the notion of appropriation, the formal theory is developed using the theories of Karl Marx. This Marxian model of technology appropriation is a three-tiered theoretical lens that examines appropriation behaviours at a highly abstract level, connecting the stages of natural, species and social being to the transition of a technology-as-artifact to a technology-in-use via the processes of perception, orientation and realization.
The contributions of this research are two-fold: (1) the substantive model contributes to practice by providing a model that describes and explains the human-technology nexus in anaesthesia, and thereby offers potential predictive capabilities for designers and administrators to optimize future appropriations of new anaesthetic technological artifacts; and (2) the formal model contributes to research by drawing attention to the philosophical foundations of appropriation in the work of Marx, and subsequently expanding the current understanding of contemporary IS theories of adoption and appropriation.
Abstract:
In dynamic and uncertain environments such as healthcare, where the needs of security and information availability are difficult to balance, an access control approach based on a static policy will be suboptimal regardless of how comprehensive it is. The uncertainty stems from the unpredictability of users’ operational needs as well as their private incentives to misuse permissions. In Role Based Access Control (RBAC), a user’s legitimate access request may be denied because its need has not been anticipated by the security administrator. Alternatively, even when the policy is correctly specified, an authorised user may accidentally or intentionally misuse the granted permission. This paper introduces a novel approach to access control under uncertainty and presents it in the context of RBAC. Taking insights from the field of economics, in particular the insurance literature, we propose a formal model in which the value of resources is explicitly defined and an RBAC policy (entailing those predictable access needs) is used only as a reference point to determine the price each user has to pay for access, as opposed to representing hard-and-fast rules that are always rigidly applied.
Abstract:
Many existing information retrieval models do not explicitly take into account information about word associations. Our approach makes use of first- and second-order relationships found in natural language, known as syntagmatic and paradigmatic associations, respectively. This is achieved by using a formal model of word meaning within the query expansion process. On ad hoc retrieval, our approach achieves statistically significant improvements in MAP (0.158) and P@20 (0.396) over our baseline model. The ERR@20 and nDCG@20 of our system were 0.249 and 0.192, respectively. Our results and discussion suggest that information about both syntagmatic and paradigmatic associations can assist with improving retrieval effectiveness on ad hoc retrieval.
Abstract:
In recent years, several works have investigated a formal model for Information Retrieval (IR) based on the mathematical formalism underlying quantum theory. These works have mainly exploited geometric and logical–algebraic features of the quantum formalism, for example entanglement, superposition of states, collapse into basis states, and lattice relationships. In this poster I present an analogy between a typical IR scenario and the double-slit experiment. This experiment exhibits interference phenomena between events in a quantum system, causing the Kolmogorovian law of total probability to fail. The analogy allows us to put forward routes for the application of quantum probability theory in IR. However, several questions still need to be addressed; they will be the subject of my PhD research.