985 results for Many-valued logic


Relevance:

30.00%

Publisher:

Abstract:

I will argue that the doctrine of eternal recurrence of the same no better interprets cosmology than pink elephants interpret zoology. I will also argue that the eternal-return-of-the-same doctrine as what Magnus calls an "existential imperative" is without possibility of application and thus futile. To facilitate those arguments, the validity of the doctrine of the eternal recurrence of the same will be tested under distinct rubrics. Although each rubric will stand alone, one per chapter, as an evaluation of some specific aspect of eternal recurrence, the rubric sequence has been selected to accommodate the identification of what I shall be calling logic abridgments. The conclusions to be extracted from each rubric are grouped under the heading CONCLUSION and appear immediately following rubric ten. If, at the end of a rubric, a reader is inclined to wonder which rubric or topic is next, and why, the answer can be found at the top of the following page; the question is usually answered in the very first sentence, and always in the first paragraph. The first rubric has been placed in order by chronological entitlement in that it deals with the evolution of the idea of eternal recurrence from the time of the ancient Greeks to Nietzsche's August 1881 inspiration. This much-recommended technique is also known as starting at the beginning. Rubric 1 also deals with twentieth-century philosophers' assessments of the relationship between Nietzsche and ancient Greek thought. The only experience of E-R, Zarathustra's mountain vision, is second only because it sets the scene alluded to in following rubrics. The third rubric explores Nietzsche's evaluation of rationality so that his thought processes will be understood appropriately. The actual mechanism of E-R is tested in rubric four. The scientific proof Nietzsche assembled in support of E-R is assessed by contemporary philosophers in rubric five. E-R's function as an ethical imperative is debated in rubrics six and seven. The extent to which E-R fulfills its purpose in overcoming nihilism is measured against the comfort assured by major world religions in rubric eight. Whether E-R also serves as a redemption for revenge is questioned in rubric nine. Rubric ten assures that E-R refers to return of the identically same and not merely the similar. In addition to assemblage and evaluation of all ten rubrics, at the end of each rubric a brief recapitulation of its principal points concludes the chapter. In this essay I will assess the theoretical conditions under which the doctrine cannot be applicable and will show what contradictions and inconsistencies follow if the doctrine is taken to be operable. Harold Alderman, in his book Nietzsche's Gift, wrote that the "doctrine of eternal recurrence gives us a problem not in Platonic cosmology, but in Socratic self-reflection" (Alderman, p. 84). I will illustrate that the recurrence doctrine's cosmogony is unworkable and that, if it were workable, it would negate self-reflection on the grounds that self-reflection cannot find its cause in eternal recurrence of the same. Thus, when the cosmology is shown to be impossible, any expected ensuing results or benefits will also be rendered impossible. The so-called "heaviest burden" will be exposed as complex, engrossing "what if" speculations deserving no links to reality. To identify abridgments of logic, contradictions, and inconsistencies in Nietzsche's doctrine of eternal recurrence of the same, I will examine the subject under the following schedule.
In Chapter 1 the ancient origins of recurrence theories will be introduced. This chapter is intended to establish the boundaries within which the subsequent chapters, except Chapter 10, will be confined. Chapter 2, Zarathustra's vision of E-R, assesses the sections of Thus Spoke Zarathustra in which the phenomenon of recurrence of the same is reported. Nihilism as a psychological difficulty is introduced in this rubric, but that subject will be studied in detail in Chapter 8. In Chapter 2 the symbols of eternal recurrence of the same will also be considered. Whether the recurrence image should be of a closed ring or of a coil will be of significance in many sections of my essay; I will argue that neither symbolic configuration can accommodate Nietzsche's supposed intention. Chapter 3 defends the description of E-R given by Zarathustra. Chapter 4, the cosmological mechanics of E-R, speculates on the seriousness with which Nietzsche might have intended the doctrine of eternal recurrence to be taken. My essay reports, and then assesses, the argument of those who suppose the doctrine to have been merely exploratory musings by Nietzsche on cosmological hypotheses. The cosmogony of E-R is also examined. In Chapter 5, cosmological proofs tested, the proofs for Nietzsche's doctrine of return of the same are evaluated. This chapter features the position taken by Martin Heidegger. My essay suggests that while Heidegger's argument that recurrence of the same is a genuine cosmic agenda is admirable, it is not at all persuasive. Chapter 6, E-R is an ethical imperative, is in essence the report of a debate between two scholars regarding the possibility of an imperative in the doctrine of recurrence. Their debate polarizes the arguments I intend to develop. Chapter 7, does E-R of the same preclude alteration of attitudes, is a continuation of the debate presented in Chapter 6, with the focus shifted from the cosmological to the psychological aspects of eternal recurrence of the same. Chapter 8, Can E-R Overcome Nihilism?, is divided into two parts. In the first, nihilism as it applies to Nietzsche's theory is discussed. In part 2, the broader consequences, sources, and definitions of nihilism are outlined. My essay argues that Nietzsche's doctrine is more nihilistic than are the world's major religions. Chapter 9, Is E-R a redemption for revenge?, examines the suggestion, extracted from Thus Spoke Zarathustra, that the doctrine of eternal recurrence is intended, among other purposes, as a redemption for mankind from the destructiveness of revenge. Chapter 10, E-R of the similar refuted, analyses a position that an element of chance can influence the doctrine of recurrence. This view appears to allow not for recurrence of the same but for recurrence of the similar. A summary will recount briefly the various significant logic abridgments, contradictions, and inconsistencies associated with Nietzsche's doctrine of eternal recurrence of the same. In the 'conclusion' section of my essay my own opinions and observations will be assembled from the body of the essay.

Relevance:

30.00%

Publisher:

Abstract:

We consider general allocation problems with indivisibilities where agents' preferences possibly exhibit externalities. In such contexts, many different core notions have been proposed. One is the gamma-core, whereby blocking is only allowed via allocations in which the non-blocking agents receive their endowments. We show that if there exists an allocation rule satisfying 'individual rationality', 'efficiency', and 'strategy-proofness', then for any problem for which the gamma-core is non-empty, the allocation rule must choose a gamma-core allocation and all agents are indifferent between all allocations in the gamma-core. We apply our result to housing markets, coalition formation, and networks.
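
A minimal sketch of the gamma-blocking notion, under assumptions not taken from the paper: a toy housing market without externalities, where each agent owns one house and a coalition may only reallocate its own endowments while outsiders keep theirs. The names, preferences, and helper functions below are hypothetical.

```python
from itertools import permutations

# Toy housing market: each agent owns one house (their endowment) and has a
# strict ranking over houses. No externalities here, so the gamma restriction
# shows up only in the rule that a coalition may trade only its own endowments.
endowment = {"A": "hA", "B": "hB", "C": "hC"}
preferences = {                      # from best to worst (assumed data)
    "A": ["hB", "hA", "hC"],
    "B": ["hA", "hB", "hC"],
    "C": ["hA", "hB", "hC"],
}

def utility(agent, house):
    # Smaller rank index = better; negate so larger is better.
    return -preferences[agent].index(house)

def gamma_blocks(coalition, allocation):
    """True if `coalition` can gamma-block `allocation`: members reallocate
    only their own endowments among themselves, every member is weakly
    better off, and at least one is strictly better off."""
    houses = [endowment[i] for i in coalition]
    for perm in permutations(houses):
        candidate = dict(zip(coalition, perm))
        weakly = all(utility(i, candidate[i]) >= utility(i, allocation[i])
                     for i in coalition)
        strictly = any(utility(i, candidate[i]) > utility(i, allocation[i])
                       for i in coalition)
        if weakly and strictly:
            return True
    return False

# The no-trade allocation is gamma-blocked by {A, B}, who can swap houses.
print(gamma_blocks(("A", "B"), endowment))  # True
```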

Relevance:

30.00%

Publisher:

Abstract:

The underlying assumptions for interpreting the meaning of data often change over time, which further complicates the problem of semantic heterogeneities among autonomous data sources. As an extension to the COntext INterchange (COIN) framework, this paper introduces the notion of temporal context as a formalization of the problem. We represent temporal context as a multi-valued method in F-Logic; however, only one value is valid at any point in time, the determination of which is constrained by temporal relations. This representation is then mapped to an abductive constraint logic programming framework with temporal relations being treated as constraints. A mediation engine that implements the framework automatically detects and reconciles semantic differences at different times. We articulate that this extended COIN framework is suitable for reasoning on the Semantic Web.
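
A minimal sketch of the temporal-context idea, not the paper's F-Logic formalization: a source attribute is associated with several values, each guarded by a validity interval, and only the value whose interval covers the query time is used. The attribute name, dates, and values below are invented for illustration.

```python
from datetime import date

# Hypothetical temporal context for one source attribute, e.g. the currency
# in which prices are reported; exactly one value is valid at any time
# (the validity intervals are assumed not to overlap).
temporal_context = {
    "price.currency": [
        (date(1990, 1, 1), date(1998, 12, 31), "FRF"),  # francs
        (date(1999, 1, 1), date.max,           "EUR"),  # euros
    ]
}

def context_value(attribute, at):
    """Select the single value whose validity interval covers `at`."""
    for start, end, value in temporal_context[attribute]:
        if start <= at <= end:
            return value
    raise LookupError(f"no context value for {attribute} at {at}")

# A mediator could use this to decide how to convert a 1995 price before
# comparing it with a 2005 price from another source.
print(context_value("price.currency", date(1995, 6, 1)))  # FRF
print(context_value("price.currency", date(2005, 6, 1)))  # EUR
```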


Relevance:

30.00%

Publisher:

Abstract:

Many communication signal processing applications involve modelling and inverting complex-valued (CV) Hammerstein systems. We develop a new CV B-spline neural network approach for efficient identification of the CV Hammerstein system and effective inversion of the estimated CV Hammerstein model. Specifically, the CV nonlinear static function in the Hammerstein system is represented using the tensor product of two univariate B-spline neural networks. An efficient alternating least squares estimation method is adopted for identifying the CV linear dynamic model's coefficients and the CV B-spline neural network's weights, which yields closed-form solutions for both the linear dynamic model's coefficients and the B-spline neural network's weights, and this estimation process is guaranteed to converge very fast to a unique minimum solution. Furthermore, an accurate inversion of the CV Hammerstein system can readily be obtained using the estimated model. In particular, the inversion of the CV nonlinear static function in the Hammerstein system can be calculated effectively using a Gauss-Newton algorithm, which naturally incorporates the efficient De Boor algorithm with both the B-spline curve and first-order derivative recursions. The effectiveness of our approach is demonstrated by its application to the equalisation of Hammerstein channels.
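
As a rough illustration of the ingredients named above (the knot vector, spline weights, and channel coefficients below are assumptions, and the example is real-valued for brevity): a Hammerstein model is a static nonlinearity, here represented by a B-spline expansion evaluated with the Cox-de Boor recursion, followed by a linear dynamic filter. A tensor product of two such bases over the real and imaginary parts would give the CV version.

```python
import numpy as np

def bspline_basis(x, knots, degree):
    """Vector of B-spline basis function values at scalar x (Cox-de Boor)."""
    n = len(knots) - degree - 1
    # Degree-0 bases: indicator functions of the knot spans.
    B = np.array([1.0 if knots[i] <= x < knots[i + 1] else 0.0
                  for i in range(len(knots) - 1)])
    for k in range(1, degree + 1):
        B_next = np.zeros(len(knots) - k - 1)
        for i in range(len(B_next)):
            left = right = 0.0
            if knots[i + k] > knots[i]:
                left = (x - knots[i]) / (knots[i + k] - knots[i]) * B[i]
            if knots[i + k + 1] > knots[i + 1]:
                right = ((knots[i + k + 1] - x)
                         / (knots[i + k + 1] - knots[i + 1]) * B[i + 1])
            B_next[i] = left + right
        B = B_next
    return B[:n]

# Toy Hammerstein simulation under assumed weights and coefficients.
knots = np.array([0, 0, 0, 0, 0.5, 1, 1, 1, 1])            # clamped cubic on [0, 1]
degree, weights = 3, np.array([0.0, 0.2, 0.9, 1.0, 1.2])    # spline weights
h = np.array([1.0, 0.5, 0.25])                              # FIR channel coefficients

u = np.random.default_rng(0).uniform(0, 1, 50)              # input sequence
v = np.array([bspline_basis(x, knots, degree) @ weights for x in u])  # static nonlinearity
y = np.convolve(v, h)[:len(u)]                              # linear dynamic part
```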

Relevance:

30.00%

Publisher:

Abstract:

Within many Anglophone nation states there is significant debate about the future of public education and its ongoing capacity to provide quality education. The new knowledge economy not only challenges the position of educators as the primary producers, disseminators and authorizers of what is valued knowledge, but also requires them to prepare students for new ways of working with that knowledge. In the service economies of post-industrial Western nations, 'knowledge work' is critical to national productivity and international competitiveness. At the same time, the globalization logic suggests that the nation state is under threat, and therefore its role as provider of universal services such as education is also threatened.

Relevance:

30.00%

Publisher:

Abstract:

How to provide cost-effective strategies for Software Testing has long been a research focus in Software Engineering. Many researchers in Software Engineering have addressed the effectiveness and quality metrics of Software Testing, and many interesting results have been obtained. However, one issue of paramount importance in software testing – the intrinsic imprecise and uncertain relationships within testing metrics – is left unaddressed. To this end, a new quality and effectiveness measurement based on fuzzy logic is proposed. Software quality features and analogy-based reasoning are discussed, which can deal with quality and effectiveness consistency between different test projects. Experimental results are also provided to verify the proposed measurement.

Relevance:

30.00%

Publisher:

Abstract:

How to provide cost-effective strategies for Software Testing has long been a research focus in Software Engineering. Many researchers in Software Engineering have addressed the effectiveness and quality metrics of Software Testing, and many interesting results have been obtained. However, one issue of paramount importance in software testing — the intrinsic imprecise and uncertain relationships within testing metrics — is left unaddressed. To this end, a new quality and effectiveness measurement based on fuzzy logic is proposed. Related issues, such as software quality features and fuzzy reasoning for test project similarity measurement, are discussed; these can deal with quality and effectiveness consistency between different test projects. Experiments were conducted to verify the proposed measurement using real data from actual software testing projects. Experimental results show that the proposed fuzzy-logic-based metric is effective and efficient for measuring and evaluating the quality and effectiveness of test projects.
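
A small hedged sketch of the kind of fuzzy similarity and quality aggregation the abstract describes; the feature names, weights, and aggregation operators are assumptions, not the paper's definitions.

```python
import numpy as np

# Each test project is described by normalized feature values in [0, 1].
# Similarity between projects is the min-aggregated per-feature closeness;
# a project's "quality" degree is a weighted fuzzy aggregation of its features.
features = ["statement_coverage", "branch_coverage", "defect_removal"]

project_a = np.array([0.92, 0.80, 0.75])   # historical, well-understood project
project_b = np.array([0.88, 0.77, 0.70])   # new project to be evaluated

def fuzzy_similarity(p, q):
    """Per-feature closeness in [0, 1], aggregated with the min t-norm."""
    closeness = 1.0 - np.abs(p - q)
    return float(np.min(closeness))

def quality_degree(p, weights=(0.4, 0.3, 0.3)):
    """Weighted fuzzy aggregation of feature memberships."""
    return float(np.dot(p, weights))

sim = fuzzy_similarity(project_a, project_b)
print(f"similarity: {sim:.2f}")
print(f"quality(b): {quality_degree(project_b):.2f}")
# Conclusions about project_a's effectiveness transfer to project_b with
# degree `sim` (analogy-based reasoning between similar test projects).
```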

Relevance:

30.00%

Publisher:

Abstract:

Headteacher workloads are often in the news. Long hours, punitive audit regimes and excessive amounts of paperwork take their toll on many, including John Illingworth, former National Union of Teachers (UK) President and ex-primary headteacher. In this paper, I investigate a UK BBC Radio 4 human interest interview conducted with Illingworth by the usually acerbic John Humphrys. Mobilising Bourdieu's notion of field, I examine the interview and argue that the media game of market share and the doxa of the fourth estate might work to delimit the capacity of such interviews to speak truth to policy power.

Relevance:

30.00%

Publisher:

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that are impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modelling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of the state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modelling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavour. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
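
As a small illustration of the declarative, open style that ConDec targets (this is plain Python, not ConDec or CLIMB syntax): a typical constraint is response(A, B), which requires every occurrence of activity A to be eventually followed by an occurrence of B, and a finished execution trace can be checked against it directly.

```python
def satisfies_response(trace, a, b):
    """True if every occurrence of `a` in `trace` is later followed by `b`."""
    pending = False
    for event in trace:
        if event == a:
            pending = True        # an occurrence of `a` awaits a later `b`
        elif event == b:
            pending = False       # the pending obligation is discharged
    return not pending

# Hypothetical traces: activity names are invented for illustration.
ok_trace  = ["register", "pay", "ship", "pay", "ship"]
bad_trace = ["register", "pay", "ship", "pay"]   # the last "pay" is never shipped

print(satisfies_response(ok_trace, "pay", "ship"))   # True
print(satisfies_response(bad_trace, "pay", "ship"))  # False
```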

Relevance:

30.00%

Publisher:

Abstract:

Recently, in most industrial automation processes, an ever-increasing degree of automation has been observed. This increase is motivated by the higher requirement for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low costs in design, realization, and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy boxed products such as food or cigarettes, and so on. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing-machine industries are present in Italy, notably the packaging-machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and the electrical and electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the operator of the machine to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, as a support to the maintenance operations of the machine. The facilities that designers can directly find on the market, in terms of software component libraries, in fact provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method for dealing organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives, and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different and usually very "unstructured" way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to illuminate the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability, and reusability are strongly emphasized and profitably realized in so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability, and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery, and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements in the formal verification of logic control, fault diagnosis, and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is presented. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of formal software verification, along with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
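
A minimal sketch of the discrete-event view used for monitoring and diagnosis; the device, states, and events below are invented for illustration and are not the thesis's Generalized Actuator or Generalized Device. A nominal finite-state model is replayed against observed events, and any event the model cannot execute is flagged as a fault symptom.

```python
# Nominal discrete-event model of a hypothetical pneumatic gripper,
# expressed as a transition map (state, event) -> next state.
NOMINAL = {
    ("open",    "cmd_close"):     "closing",
    ("closing", "closed_sensor"): "closed",
    ("closed",  "cmd_open"):      "opening",
    ("opening", "open_sensor"):   "open",
}

def monitor(events, state="open"):
    """Replay observed events against the nominal automaton;
    return (final_state, list_of_fault_symptoms)."""
    symptoms = []
    for e in events:
        nxt = NOMINAL.get((state, e))
        if nxt is None:
            symptoms.append(f"unexpected '{e}' in state '{state}'")
        else:
            state = nxt
    return state, symptoms

# A missing 'closed_sensor' event makes the later observations inconsistent
# with the nominal model, so the monitor reports fault symptoms.
observed = ["cmd_close", "cmd_open", "open_sensor"]
print(monitor(observed))
```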