957 results for first-order paraconsistent logic


Relevance:

100.00%

Publisher:

Abstract:

The human visual system is sensitive to second-order modulations of the local contrast (CM) or amplitude (AM) of a carrier signal. Second-order cues are detected independently of first-order luminance (LM) signals; however, it is not clear why vision should benefit from second-order sensitivity. Analysis of the first- and second-order contents of natural images suggests that these cues tend to occur together, but their phase relationship varies. We have shown that in-phase combinations of LM and AM are perceived as a shaded corrugated surface, whereas the anti-phase combination can be seen as corrugated when presented alone or as a flat material change when presented in a plaid containing the in-phase cue. We now extend these findings using new stimulus types and a novel haptic matching task. We also introduce a computational model based on initially separate first- and second-order channels that are combined within orientation and subsequently across orientation to produce a shading signal. Contrast gain control allows the LM+AM cue to suppress responses to the LM−AM cue when presented in a plaid. Thus, the model sees LM−AM as flat in these circumstances. We conclude that second-order vision plays a key role in disambiguating the origin of luminance changes within an image. © ARVO.
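
As a purely illustrative sketch (not the published model; all names and parameter choices here are placeholders), the channel-combination-with-gain-control scheme described above can be pictured in a few lines of Python: in-phase LM and AM responses reinforce each other, the anti-phase combination largely cancels, and a divisive normalisation pool lets a strong LM+AM component suppress the response to LM−AM in a plaid.

```python
def shading_response(lm, am, orientations, eps=0.01):
    """Toy combination of first-order (LM) and second-order (AM) channel
    responses into a shading signal: combine within orientation, apply
    divisive contrast gain control, then pool across orientation."""
    # Gain-control pool: rectified responses of all channels at all orientations.
    norm = eps + sum(abs(lm[o]) + abs(am[o]) for o in orientations)
    pooled = 0.0
    for o in orientations:
        combined = lm[o] + am[o]   # LM+AM reinforces, LM-AM partially cancels
        pooled += combined / norm  # divisive normalisation
    return pooled

# Toy example with a single orientation: in-phase vs anti-phase cues.
orients = ["vertical"]
print(shading_response({"vertical": 1.0}, {"vertical": 1.0}, orients))   # strong "shading" response
print(shading_response({"vertical": 1.0}, {"vertical": -1.0}, orients))  # near zero, i.e. seen as flat
```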

Relevance:

100.00%

Publisher:

Abstract:

Autonomic systems are required to adapt continually to changing environments and user goals. This process involves the real-time update of the system's knowledge base, which should therefore be stored in a machine-readable format and automatically checked for consistency. OWL ontologies meet both requirements, as they represent collections of knowledge expressed in first-order logic and feature embedded reasoners. To take advantage of these OWL ontology characteristics, this PhD project will devise a framework comprising a theoretical foundation, tools and methods for developing knowledge-centric autonomic systems. Within this framework, the knowledge storage and maintenance roles will be fulfilled by a specialised class of OWL ontologies. © 2014 ACM.
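
As a hedged illustration of the consistency-checking step this project relies on, the sketch below uses the owlready2 Python library (one possible toolkit, not necessarily the one used in the project) to load an ontology and run an embedded reasoner; the ontology IRI is a made-up placeholder.

```python
# Minimal sketch: load an OWL ontology and check it for consistency with an
# embedded reasoner. The IRI below is a hypothetical placeholder.
from owlready2 import get_ontology, sync_reasoner, OwlReadyInconsistentOntologyError

onto = get_ontology("http://example.org/autonomic-knowledge.owl").load()

try:
    with onto:
        sync_reasoner()  # runs the bundled HermiT reasoner over the loaded ontology
    print("Knowledge base is consistent; the update can be accepted.")
except OwlReadyInconsistentOntologyError:
    print("Inconsistent knowledge base; the latest update should be rejected or repaired.")
```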

Relevance:

100.00%

Publisher:

Abstract:

Nonmonotonic Logics such as Autoepistemic Logic, Reflective Logic, and Default Logic are usually defined by set-theoretic fixed-point equations over deductively closed sets of sentences of First Order Logic. Such systems may also be represented as necessary equivalences in a Modal Logic stronger than S5, with the added advantage that such representations may be generalized to allow quantified variables crossing modal scopes, resulting in a Quantified Autoepistemic Logic, a Quantified Autoepistemic Kernel, a Quantified Reflective Logic, and a Quantified Default Logic. Quantifiers in all these generalizations obey all the normal laws of logic, including both the Barcan formula and its converse. Herein, we address the problem of solving some necessary equivalences containing universal quantifiers over modal scopes. Solutions obtained by these methods are then compared to related results obtained in the literature by Circumscription in Second Order Logic, since the disjunction of all the solutions of a necessary equivalence containing just normal defaults in these Quantified Logics is equivalent to that system.
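
For orientation, the kind of set-theoretic fixed-point equation referred to here is exemplified by Moore's standard definition of a stable expansion T of an autoepistemic theory A (a textbook definition, not a formula taken from this paper):

```latex
T \;=\; \mathrm{Cn}\Bigl(A \;\cup\; \{\, L\varphi : \varphi \in T \,\} \;\cup\; \{\, \neg L\varphi : \varphi \notin T \,\}\Bigr)
```

where Cn denotes classical (first-order) consequence and L is the modal belief operator; the quantified systems discussed above generalize equations of this shape so that variables may cross the modal scope.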

Relevance:

100.00%

Publisher:

Abstract:

AMS subject classification: Primary 34A60, Secondary 49K24.

Relevance:

100.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62G32, 62G20.

Relevance:

100.00%

Publisher:

Abstract:

We present, for the first time, a detailed investigation of the impact of second-order co-propagating Raman pumping on long-haul 100G WDM DP-QPSK coherent transmission of up to 7082 km using Raman fibre laser based configurations. Signal power and noise distributions along the fibre for each pumping scheme were characterised both numerically and experimentally. Based on these pumping schemes, the Q factor penalties versus co-pump power ratios were experimentally measured and quantified. A significant Q factor penalty of up to 4.15 dB was observed after 1666 km using symmetric bidirectional pumping, compared with counter-pumping only. Our results show that whilst co-pumping minimises the intra-cavity signal power variation and amplification noise, the Q factor penalty with co-pumping was too great for any advantage to be seen. The relative intensity noise (RIN) characteristics of the induced fibre laser and the output signal, and the intra-cavity RF spectra of the fibre laser, are also presented. We attribute the Q factor degradation to a RIN-induced penalty caused by RIN being transferred from the first-order fibre laser and the second-order co-pump to the signal. More importantly, two different fibre lasing regimes contributed to the amplification: random distributed feedback lasing when using counter-pumping only, and conventional Fabry-Perot cavity lasing with all bidirectional pumping schemes. This also results in significantly different performance due to the different laser cavity lengths of these two classes of laser.

Relevance:

100.00%

Publisher:

Abstract:

We propose a description logic extending SROIQ (the description logic underlying OWL 2 DL) and at the same time encompassing some of the most prominent monotonic and nonmonotonic rule languages, in particular Datalog extended with the answer set semantics. Our proposal could be considered a substantial contribution towards fulfilling the quest for a unifying logic for the Semantic Web. As a case in point, two non-monotonic extensions of description logics considered to be of distinct expressiveness until now are covered in our proposal. In contrast to earlier such proposals, our language has the "look and feel" of a description logic and avoids hybrid or first-order syntaxes. © 2012 The Author(s).

Relevance:

100.00%

Publisher:

Abstract:

We experimentally investigate three Raman fibre laser based amplification techniques with second-order bidirectional pumping. Relative intensity noise (RIN) transferred to the signal can be significantly suppressed by reducing first-order reflection near the input end. © 2015 OSA.

Relevance:

100.00%

Publisher:

Abstract:

Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early on in the development cycle. Software architecture design has received a lot of attention in recent years, with several methods, techniques and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for the behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of Software Architectures. The Software Architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for its correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined which includes the evaluation of test cases based on Petri net testing theory to be used in the testing process at the design level. Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to testing and verification of SAM models with the aid of the SAM tool.
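
As a generic illustration (not a property taken from the dissertation), a first-order linear temporal logic property of the kind checked against a SAM behavioural model might state that every client request is eventually acknowledged:

```latex
\forall c \;\; \Box \bigl( \mathit{request}(c) \;\rightarrow\; \Diamond\, \mathit{ack}(c) \bigr)
```

where the box reads "always" and the diamond "eventually"; properties of this shape are what the Spin translation of the SAM model is verified against.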

Relevance:

100.00%

Publisher:

Abstract:

Petri Nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems and have been widely applied in computer science and many other engineering disciplines. Low level Petri nets are simple and useful for modeling control flows but not powerful enough to define data and system functionality. High level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low level Petri nets, HLPNs result in compact system models that are easier to understand. Therefore, HLPNs are more useful in modeling complex systems. There are two issues in using HLPNs—modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied, which are integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPNs called Predicate Transition Net (PrT Net) is used to model a system's behavior and a first-order linear time temporal logic (FOLTL) to specify the system's properties. The main contribution of this dissertation with regard to modeling is to develop a software tool to support the formal modeling capabilities in this framework. For analysis, this framework combines three complementary techniques: simulation, explicit state model checking and bounded model checking (BMC). Simulation is a straightforward and speedy method, but only covers some execution paths in a HLPN model. Explicit state model checking covers all the execution paths but suffers from the state explosion problem. BMC is a tradeoff, as it provides a certain level of coverage while being more efficient than explicit state model checking. The main contribution of this dissertation with regard to analysis is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool to support the formal analysis capabilities in this framework. The SAMTools developed for this framework in this dissertation integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
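
To make the modeling idea concrete, here is a minimal, heavily simplified sketch (illustrative names and semantics, not PIPE+ itself) of a high-level Petri net in Python: places hold multisets of structured tokens, and a transition fires when its guard holds for some input token, consuming it and producing a transformed token.

```python
# A minimal, illustrative high-level Petri net interpreter: places hold
# multisets of structured tokens; a single-input, single-output transition
# fires when its guard holds for some token binding. Semantics are simplified.
from collections import Counter

class HLPN:
    def __init__(self, marking):
        # marking: dict place -> iterable of tokens (tokens may be tuples, i.e. structured data)
        self.marking = {p: Counter(ts) for p, ts in marking.items()}

    def fire(self, in_place, out_place, guard, expr):
        """Pick any token in `in_place` satisfying `guard`, remove it, and add
        `expr(token)` to `out_place`. Returns True if the transition was enabled."""
        for token in list(self.marking[in_place]):
            if guard(token):
                self.marking[in_place][token] -= 1
                self.marking[in_place] += Counter()          # drop zero counts
                self.marking[out_place][expr(token)] += 1
                return True
        return False

# Toy run: tokens are (job_id, size); the transition processes jobs smaller than 10.
net = HLPN({"pending": [("j1", 4), ("j2", 12)], "done": []})
fired = net.fire("pending", "done",
                 guard=lambda t: t[1] < 10,
                 expr=lambda t: (t[0], "processed"))
print(fired, dict(net.marking["pending"]), dict(net.marking["done"]))
```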

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes extended nonlinear analytical models, third-order models, of compliant parallelogram mechanisms. These models can accurately capture the effects of the very large axial force over a transverse motion range of 10% of the beam length by incorporating the terms associated with the high-order (up to third-order) axial force. Firstly, the free-body diagram method is employed to derive the nonlinear analytical model for a basic compliant parallelogram mechanism based on load-displacement relations of a single beam, geometry compatibility conditions, and load-equilibrium conditions. The procedures for the forward solutions and inverse solutions are described. Nonlinear analytical models for guided compliant multi-beam parallelogram mechanisms are then obtained. A case study of the compound compliant parallelogram mechanism, composed of two basic compliant parallelogram mechanisms in symmetry, is further implemented. This work estimates the internal axial force change, the transverse force change, and the transverse stiffness change with the transverse motion using the proposed third-order model, in comparison with the first-order model proposed in the prior art. In addition, FEA (finite element analysis) results validate the accuracy of the third-order model for a typical example. It is shown that in the case study the slenderness ratio significantly affects the discrepancy between the third-order model and the first-order model, and that the third-order model can capture a non-monotonic transverse stiffness curve if the beam is thin enough.
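
Schematically, and only as a generic illustration rather than the authors' exact expressions, the difference between the two models can be viewed as the truncation order in the normalised axial force p retained in the transverse stiffness:

```latex
k_t^{(1)}(p) \;\approx\; k_0\,(1 + c_1 p), \qquad
k_t^{(3)}(p) \;\approx\; k_0\,\bigl(1 + c_1 p + c_2 p^2 + c_3 p^3\bigr)
```

where k_0 is the transverse stiffness at zero axial load and the c_i are geometry-dependent coefficients; retaining the higher-order terms is what allows the non-monotonic stiffness behaviour reported for slender beams.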

Relevance:

100.00%

Publisher:

Abstract:

We propose three research problems to explore the relations between trust and security in the setting of distributed computation. In the first problem, we study trust-based adversary detection in distributed consensus computation. The adversaries we consider behave arbitrarily, disobeying the consensus protocol. We propose a trust-based consensus algorithm with local and global trust evaluations. The algorithm can be abstracted using a two-layer structure, with the top layer running a trust-based consensus algorithm and the bottom layer acting as a subroutine executing a global trust update scheme. We utilize a set of pre-trusted nodes, headers, to propagate local trust opinions throughout the network. This two-layer framework is flexible in that it can be easily extended to incorporate more complicated decision rules and global trust schemes. The first problem assumes that normal nodes are homogeneous, i.e., it is guaranteed that a normal node always behaves as it is programmed. In the second and third problems, however, we assume that nodes are heterogeneous, i.e., given a task, the probability that a node generates a correct answer varies from node to node. The adversaries considered in these two problems are workers from the open crowd who either invest little effort in the tasks assigned to them or intentionally give wrong answers to questions. In the second part of the thesis, we consider a typical crowdsourcing task that aggregates input from multiple workers as a problem in information fusion. To cope with the issue of noisy and sometimes malicious input from workers, trust is used to model workers' expertise. In a multi-domain knowledge learning task, however, using scalar-valued trust to model a worker's performance is not sufficient to reflect the worker's trustworthiness in each of the domains. To address this issue, we propose a probabilistic model to jointly infer multi-dimensional trust of workers, multi-domain properties of questions, and true labels of questions. Our model is very flexible and extensible to incorporate metadata associated with questions. To show this, we further propose two extended models, one of which handles input tasks with real-valued features and the other of which handles tasks with text features by incorporating topic models. Our models can effectively recover trust vectors of workers, which can be very useful for task assignment adaptive to workers' trust in the future. These results can be applied to the fusion of information from multiple data sources such as sensors, human input, machine learning results, or a hybrid of them. In the second subproblem, we address crowdsourcing with adversaries under logical constraints. We observe that questions are often not independent in real-life applications; instead, there are logical relations between them. Similarly, workers who provide answers are not independent of each other either: answers given by workers with similar attributes tend to be correlated. Therefore, we propose a novel unified graphical model consisting of two layers. The top layer encodes domain knowledge, which allows users to express logical relations using first-order logic rules, and the bottom layer encodes a traditional crowdsourcing graphical model. Our model can be seen as a generalized probabilistic soft logic framework that encodes both logical relations and probabilistic dependencies. To solve the collective inference problem efficiently, we have devised a scalable joint inference algorithm based on the alternating direction method of multipliers.
The third part of the thesis considers the problem of optimal assignment under budget constraints when workers are unreliable and sometimes malicious. In a real crowdsourcing market, each answer obtained from a worker incurs a cost. The cost is associated with both the level of trustworthiness of the workers and the difficulty of the tasks. Typically, access to expert-level (more trustworthy) workers is more expensive than to the average crowd, and completion of a challenging task is more costly than a click-away question. We therefore address the problem of optimally assigning heterogeneous tasks to workers of varying trust levels under budget constraints. Specifically, we design a trust-aware task allocation algorithm that takes as inputs the estimated trust of workers and a pre-set budget, and outputs the optimal assignment of tasks to workers. We derive a bound on the total error probability that naturally relates to the budget, the trustworthiness of the crowd, and the costs of obtaining labels from the crowd: a higher budget, a more trustworthy crowd, and less costly jobs result in a lower theoretical bound. Our allocation scheme does not depend on the specific design of the trust evaluation component; therefore, it can be combined with generic trust evaluation algorithms.
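
As a small, illustrative stand-in for the probabilistic trust models described above (not the thesis's actual algorithm), the following Python sketch shows the basic feedback loop behind trust-weighted aggregation: labels are estimated by trust-weighted voting, each worker's trust is then re-estimated from agreement with those labels, and the two steps are iterated.

```python
# Illustrative trust-weighted label aggregation with binary answers (+1/-1).
# answers[w][q] is worker w's answer to question q.

def aggregate(answers, n_iters=10):
    workers = list(answers)
    questions = {q for w in workers for q in answers[w]}
    trust = {w: 0.5 for w in workers}          # start with uniform trust

    labels = {}
    for _ in range(n_iters):
        # Step 1: trust-weighted vote for each question's label.
        for q in questions:
            vote = sum(trust[w] * answers[w][q] for w in workers if q in answers[w])
            labels[q] = 1 if vote >= 0 else -1
        # Step 2: a worker's trust is its agreement rate with the current labels.
        for w in workers:
            qs = list(answers[w])
            agree = sum(answers[w][q] == labels[q] for q in qs)
            trust[w] = agree / len(qs)
    return labels, trust

answers = {
    "expert":  {"q1": +1, "q2": -1, "q3": +1},
    "spammer": {"q1": -1, "q2": +1, "q3": -1},
    "casual":  {"q1": +1, "q2": -1, "q3": -1},
}
labels, trust = aggregate(answers)
print(labels, trust)   # workers who disagree with the consensus end up with low trust
```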

Relevance:

100.00%

Publisher:

Abstract:

This paper reports a case study in the use of proof planning in the context of higher-order syntax. Rippling is a heuristic for guiding rewriting steps in induction that has been used successfully in proof planning inductive proofs using first-order representations. Ordinal arithmetic provides a natural set of higher-order examples on which transfinite induction may be attempted using rippling. Previously, Boyer-Moore style automation could not be applied to such domains. We demonstrate that a higher-order extension of the rippling heuristic is sufficient to plan such proofs automatically. Accordingly, ordinal arithmetic has been implemented in lambda-clam, a higher-order proof planning system for induction, and standard undergraduate textbook problems have been successfully planned. We show the synthesis of a fixpoint for normal ordinal functions, which demonstrates how our automation could be extended to produce more interesting results than the textbook examples tried so far.
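
For reference, the fixpoint synthesis alluded to at the end rests on a standard textbook fact: every normal (strictly increasing and continuous) ordinal function f has fixpoints above any ordinal alpha, the least of which is reached by iterating f omega times:

```latex
\mu_\alpha \;=\; \sup\{\, f^{\,n}(\alpha) : n < \omega \,\}, \qquad f(\mu_\alpha) = \mu_\alpha
```

By continuity of f, applying f to the supremum of the increasing iteration sequence returns the same supremum, which is exactly the fixpoint property.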