806 results for Difference logic
Abstract:
Due to the growing interest in social networks, link prediction has received significant attention. Link prediction is mostly based on graph-based features, with some recent approaches focusing on domain semantics. We propose algorithms for link prediction that use a probabilistic ontology to enhance the analysis of the domain and to handle the unavoidable uncertainty in the task (the ontology is specified in the probabilistic description logic crALC). The scalability of the approach is investigated through a combination of semantic assumptions and graph-based features. We evaluate our proposal empirically and compare it with standard solutions in the literature.
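As a hedged illustration of the graph-based features mentioned above, the sketch below computes two standard link-prediction scores, common neighbors and Adamic-Adar; the function names and the toy graph are illustrative, not taken from the thesis, which combines such features with the crALC ontology:

```python
import math
import networkx as nx

def common_neighbors_score(G, u, v):
    """Number of neighbors shared by u and v, a classic link-prediction feature."""
    return len(set(G[u]) & set(G[v]))

def adamic_adar_score(G, u, v):
    """Adamic-Adar index: shared neighbors weighted by inverse log-degree."""
    return sum(1.0 / math.log(G.degree(w))
               for w in set(G[u]) & set(G[v]) if G.degree(w) > 1)

# Toy social network: score the candidate link (1, 4).
G = nx.Graph([(1, 2), (1, 3), (2, 3), (2, 4), (3, 4)])
print(common_neighbors_score(G, 1, 4))  # 2
print(adamic_adar_score(G, 1, 4))       # ~1.82
```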
Abstract:
The main objective of this work is to present an efficient method for phasor estimation based on a compact Genetic Algorithm (cGA) implemented in a Field Programmable Gate Array (FPGA). To validate the proposed method, an Electrical Power System (EPS) simulated by the Alternative Transients Program (ATP) provides the data used by the cGA; these data are as close as possible to the actual data provided by the EPS. Real-life situations such as islanding, sudden load increase and permanent faults were considered. The implementation aims to take advantage of the inherent parallelism of Genetic Algorithms in a compact and optimized way, making them an attractive option for practical applications in real-time estimation concerning Phasor Measurement Units (PMUs).
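The compact GA mentioned above replaces a full population with a single probability vector over bit positions, which is what makes it small enough for FPGA implementation. A minimal software sketch of the core loop, assuming a generic bit-string objective in place of the phasor-estimation fitness (which the abstract does not detail), is:

```python
import random

def cga(fitness, n_bits, pop_size=50, max_gens=2000):
    """Compact Genetic Algorithm: evolve a probability vector, not a population."""
    p = [0.5] * n_bits  # probability that each bit equals 1
    for _ in range(max_gens):
        # Sample two competing individuals from the probability vector.
        a = [1 if random.random() < pi else 0 for pi in p]
        b = [1 if random.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        # Shift the vector toward the winner wherever the two disagree.
        for i in range(n_bits):
            if winner[i] != loser[i]:
                step = 1.0 / pop_size
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi in (0.0, 1.0) for pi in p):  # vector has converged
            break
    return [1 if pi > 0.5 else 0 for pi in p]

# Toy objective (OneMax): maximize the number of 1-bits.
print(cga(sum, n_bits=16))
```

On hardware, each bit position can update its probability independently, which is one way to exploit the parallelism the abstract refers to.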
Abstract:
A complete laser cooling setup was built, with focus on three-dimensional near-resonant optical lattices for cesium. These consist of regularly ordered micropotentials, created by the interference of four laser beams. One key feature of optical lattices is an inherent “Sisyphus cooling” process. It efficiently extracts kinetic energy from the atoms, leading to equilibrium temperatures of a few µK. The corresponding kinetic energy is lower than the depth of the potential wells, so that atoms can be trapped. We performed detailed studies of the cooling processes in optical lattices using the time-of-flight and absorption-imaging techniques. We investigated the dependence of the equilibrium temperature on the optical lattice parameters, such as detuning, optical potential and lattice geometry. The presence of neighbouring transitions in the cesium hyperfine level structure was used to break symmetries in order to identify which roles “red” and “blue” transitions play in the cooling. We also examined the limits of the cooling process in optical lattices, and the possible difference in steady-state velocity distributions for different directions. Moreover, in collaboration with the École Normale Supérieure in Paris, numerical simulations were performed in order to gain more insight into the cooling dynamics of optical lattices. Optical lattices can keep atoms almost perfectly isolated from the environment and have therefore been suggested as a platform for a host of possible experiments aimed at coherent quantum manipulations, such as spin-squeezing and the implementation of quantum logic gates. We developed a novel way to trap two different cesium ground states in two distinct, interpenetrating optical lattices, and to change the distance between sites of one lattice relative to sites of the other. This is a first step towards the implementation of quantum simulation schemes in optical lattices.
Abstract:
Stone Age research on Northern Europe frequently makes gross generalizations about the Mesolithic and Neolithic, although we still lack much basic knowledge of how the people lived. The transition from the Mesolithic to the Neolithic in Europe has been described as a radical shift from an economy dominated by marine resources to one solely dependent on farming. Both the occurrence and the geographical extent of such a drastic shift can be questioned, however. It is therefore important to start out at a more detailed level of evidence in order to present the overall picture, and to account for the variability even in such regional or chronological overviews. Fifteen Stone Age sites were included in this study, ranging chronologically from the Early Mesolithic to the Middle or Late Neolithic, c. 8300–2500 BC, and stretching geographically from the westernmost coast of Sweden to the easternmost part of Latvia within the confines of latitudes 55–59° N. The most prominent sites in terms of the number of human and faunal samples analysed are Zvejnieki, Västerbjers and Skateholm I–II. Human and faunal skeletal remains were subjected to stable carbon and nitrogen isotope analysis to study diet and ecology at the sites. Stable isotope analyses of human remains provide quantitative information on the relative importance of various food sources, an important addition to the qualitative data supplied by certain artefacts and structures or by faunal or botanical remains. A vast number of new radiocarbon dates were also obtained. In conclusion, a rich diversity in Stone Age dietary practice in the Baltic Region was demonstrated. Evidence ranging from the Early Mesolithic to the Late Neolithic shows that neither chronology nor location alone can account for this variety; there are inevitably cultural factors as well. Food habits are culturally governed, and therefore we cannot automatically assume that people at similar sites will have the same diet. Stable isotope studies are very important here, since they tell us what people actually consumed, not only what was available, or what one single meal contained. We should be wary of inferring diet from ritually deposited remains, since things that were mentally important were not always important in daily life. Thus, although a ritual and symbolic norm may emphasize certain food categories, these may in fact contribute very little to the diet. Progress in the analysis of intra-individual variation has produced new data on life-history changes, revealing mobility patterns, breastfeeding behaviour and certain dietary transitions. The inclusion of faunal data has proved invaluable for understanding the stable isotope ecology of a site, and thereby improving the precision of the interpretations of human stable isotope data. The special case of dogs, though, demonstrates that these animals are not useful for inferring human diet: given the number of roles they play in human society, dogs can deviate significantly from humans in their diet, and in several cases have been proved to do so. When evaluating radiocarbon data derived from human and animal remains from the Pitted-Ware site of Västerbjers on Gotland, the importance of establishing the stable isotope ecology of the site before drawing conclusions about reservoir effects was further demonstrated. The main aim of this thesis has been to demonstrate the variation and diversity in human practices, challenging the view of a “monolithic” Stone Age.
By looking at individuals and not only at populations, this study accounts for the whole range of human behaviour, revealing discrepancies between norm and practice that are frequently visible both in the archaeological record and in present-day human behaviour.
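For reference, stable isotope results of this kind are conventionally reported in δ notation, expressed in per mil (‰) relative to a standard (VPDB for carbon, atmospheric N₂ for nitrogen); the defining formula, stated here for completeness rather than taken from the thesis, is

```latex
\delta^{13}\mathrm{C} \;=\; \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000,
\qquad R = {}^{13}\mathrm{C}/{}^{12}\mathrm{C}
```

Marine foods are enriched in ¹³C and ¹⁵N relative to terrestrial foods, which is what makes the relative contribution of marine versus terrestrial protein readable from human bone collagen.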
Abstract:
Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented to achieve some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control such complexity, and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised some issues, from the language for specifying them to the several aspects of verification. Computational Logic provides models, languages and tools that can be effectively adopted to address such issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming, and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that makes it possible to reason upon the specifications and to test the conformance of given interactions w.r.t. a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for addressing (1) the verification of protocol properties (the g-SCIFF framework), and (2) the a-priori conformance verification of peers w.r.t. a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used to program the interacting peers and to ease their implementation task.
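SCIFF specifications revolve around happened events and expectations: positive expectations (E) state that some event must occur, negative ones (EN) that it must not. As a minimal, purely propositional illustration of the conformance test mentioned above (the actual proof procedure handles variables, constraints and abduction, none of which appear here), consider:

```python
def conformant(happened, positive_exp, negative_exp):
    """Ground sketch of SCIFF-style conformance: every E expectation must be
    matched by a happened event, and no EN expectation may be matched."""
    fulfilled = all(e in happened for e in positive_exp)
    violated = any(e in happened for e in negative_exp)
    return fulfilled and not violated

# Toy interaction: a request must be answered and must not be refused.
trace = {("tell", "alice", "bob", "request"), ("tell", "bob", "alice", "answer")}
E = {("tell", "bob", "alice", "answer")}    # expected to happen
EN = {("tell", "bob", "alice", "refuse")}   # expected not to happen
print(conformant(trace, E, EN))  # True: the interaction conforms
```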
Abstract:
Sustainable computer systems require some flexibility to adapt to unpredictable environmental changes. A solution lies in autonomous software agents, which can adapt autonomously to their environments. Though autonomy allows agents to decide which behavior to adopt, a disadvantage is a lack of control, and as a side effect even untrustworthiness: we want to keep some control over such autonomous agents. How can we control autonomous agents while respecting their autonomy? A solution is to regulate agents' behavior by norms. The normative paradigm makes it possible to control autonomous agents while respecting their autonomy, limiting untrustworthiness and increasing system compliance. It can also facilitate the design of the system, for example by regulating the coordination among agents. However, an autonomous agent will follow norms or violate them depending on the conditions. Under what conditions is a norm binding upon an agent? While autonomy is regarded as the driving force behind the normative paradigm, cognitive agents provide a basis for modeling the bindingness of norms. In order to cope with the complexity of modeling cognitive agents and normative bindingness, we adopt an intentional stance. Since agents are embedded in a dynamic environment, events may not happen at the same instant. Accordingly, our cognitive model is extended to account for some temporal aspects. Special attention is given to the temporal peculiarities of the legal domain, such as, among others, the time in force and the time in efficacy of provisions. Some types of normative modifications are also discussed in the framework. It is noteworthy that our temporal account of legal reasoning is integrated with our commonsense temporal account of cognition. As our intention is to build sustainable reasoning systems running in unpredictable environments, we adopt a declarative representation of knowledge. A declarative representation of norms makes it easier to update their representation in the system, thus facilitating system maintenance, and improves system transparency, thus easing system governance. Since agents are bounded and embedded in unpredictable environments, and since conflicts may appear amongst mental states and norms, agent reasoning has to be defeasible, i.e. new pieces of information can invalidate formerly derivable conclusions. In this dissertation, our model is formalized in a non-monotonic logic, namely a temporal modal defeasible logic, in order to account for the interactions between normative systems and software cognitive agents.
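The distinction drawn above between the time in force and the time in efficacy of a provision can be made concrete with a small temporal check; this is only a sketch of the temporal gate (in the dissertation's model, bindingness also involves the agent's mental states), and the attribute names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Norm:
    in_force: tuple     # (start, end): the provision belongs to the legal system
    in_efficacy: tuple  # (start, end): the provision actually produces effects

    def temporally_binding_at(self, t):
        """A norm can bind an agent at time t only if it is both in force
        and in efficacy at t (a necessary, not sufficient, condition)."""
        return (self.in_force[0] <= t <= self.in_force[1]
                and self.in_efficacy[0] <= t <= self.in_efficacy[1])

# A provision in force from t=0 but efficacious only from t=10 (vacatio legis).
n = Norm(in_force=(0, 100), in_efficacy=(10, 100))
print(n.temporally_binding_at(5))   # False: in force but not yet efficacious
print(n.temporally_binding_at(20))  # True
```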
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of anodic and dynodic signals of a PMT (Photo-Multiplier Tube). The low-power analog acquisition makes it possible to sample multiple channels of the PMT simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature would probably be integrated in a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the configuration memory of the FPGA implied the integration of a flash ISP (In System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA. PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from Roma University and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which were able to receive and execute commands issued by the PC console and to answer back with a reply. The remotely configurable logic behaved well too, demonstrating, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end but inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. The chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant for charge collection and for shielding the analog electronics from digital noise.
The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN in Geneva (CH). The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface towards and from the MAPS chip, in order to integrate it into the general DAQ system. Thereafter I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 µm presented an overall efficiency of about 90% with a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/√12 formula. The MAPS intrinsic resolution has been extracted from the width of the residual plot, taking into account the multiple scattering effect.
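The pitch/√12 figure quoted above is the standard resolution of a binary (hit/no-hit) position measurement: if the true hit position is uniformly distributed within one pixel of pitch p, the reconstruction error has the variance of a uniform distribution,

```latex
\sigma^{2} \;=\; \frac{1}{p} \int_{-p/2}^{p/2} x^{2}\, dx \;=\; \frac{p^{2}}{12}
\qquad\Longrightarrow\qquad
\sigma \;=\; \frac{p}{\sqrt{12}} \;\approx\; 0.29\, p
```

so doing better than this binary limit requires charge sharing between pixels or cluster-based interpolation.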
Abstract:
The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals, impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
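As a concrete taste of the declarative style: the ConDec constraint response(a, b) only requires that every occurrence of a be eventually followed by b, leaving everything else unconstrained. A minimal, hypothetical trace checker for this single constraint is sketched below (in the dissertation such constraints are instead translated to CLIMB integrity constraints and verified by logic-based reasoning):

```python
def satisfies_response(trace, a, b):
    """ConDec response(a, b): every occurrence of a is eventually followed by b."""
    return all(b in trace[i + 1:] for i, event in enumerate(trace) if event == a)

print(satisfies_response(["a", "c", "b"], "a", "b"))  # True
print(satisfies_response(["a", "b", "a"], "a", "b"))  # False: last 'a' is unanswered
print(satisfies_response(["c", "c"], "a", "b"))       # True: 'a' never occurs
```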
Abstract:
The focus of this dissertation is the relationship between the necessity for protection and the construction of cultural identities. In particular, by cultural identities I mean the representation and construction of communities: national communities, religious communities or local communities. By protection I mean the need for individuals and groups to be reassured about dangers and risks. From an anthropological point of view, the relationship between the need for protection and the formation and construction of collective identities is driven by the defensive function of culture. This was recognized explicitly by Claude Lévi-Strauss and Jurij Lotman. To explore the “protective hypothesis,” it was especially useful to compare the immunitarian paradigm, proposed by Roberto Esposito, with a semiotic approach to the problem. According to Esposito, immunity traces borders, dividing Community from what should be kept outside: the enemies, dangers and chaos, and, in general, whatever is perceived to be a threat to collective and individual life. I recognized two dimensions in the concept of immunity. The first is the logical dimension: every element of a system makes sense because of the network of differential relations in which it is inscribed. The second is the social praxis of division and definition of who We are (or what is inside the border) and who They are (or what is, and must be kept, outside the border). I tested my hypothesis by analyzing two subject areas in particular: first, the security practices in London after 9/11 and 7/7; and, second, the Spiritual Guide of the 9/11 suicide bombers. In both cases, one observes the construction of two entities: We and They. The difference between the two cases is their “model of the world”: in the London case, one finds the political paradigms of security as Sovereignty, Governmentality and Biopolitics; in the Spiritual Guide, one observes a religious model of the Community of God confronting the Community of Evil. From a semiotic point of view, the problem is the origin of the respective values, the origin of the respective moral universes, and the construction of authority. In both cases, I found that emotional dynamics are crucial in the process of forming collective identities and in the process of motivating the involved subjects: specifically, the role of fear and terror is the primary factor, and represents the principal focus of my research.
Abstract:
Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) w.r.t. a goal, and to provide results such as “the goal is derivable from the KB (of the theory)”. In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic, usually, we try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with programming logic we need an efficient algorithm in order to find such implications. In this work we use a logic rather similar to human logic. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion depending on premises that are not definitely true and belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). These kinds of applications are useful in the legal area, especially if they offer an implementation of an argumentation framework that provides a formal model of the game. Roughly speaking, if the theory is the set of laws, a keyclaim is the conclusion that one of the parties wants to prove (and the other one wants to defeat), and we add dynamic assertion of rules, namely facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the different strategies performed by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a Meta-level containing different Meta-evaluators. The first has been explained above, the second is needed to run the game model, and the last will be used to change game execution and tree-derivation strategies.
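A minimal propositional sketch of the defeasible evaluation described above follows; it implements only the core idea (a conclusion is defeasibly provable when some applicable rule for it defeats every applicable rule for its complement), omitting strict rules, defeaters and the argumentation-game layer. The rule names and the example are illustrative, not the thesis implementation:

```python
# Rules are (name, antecedents, consequent); 'superior' maps a rule name
# to the set of rule names it defeats.
rules = [
    ("r1", {"bird"},    "flies"),
    ("r2", {"penguin"}, "-flies"),
]
superior = {"r2": {"r1"}}  # r2 overrides r1
facts = {"bird", "penguin"}

def negate(lit):
    """Complement of a literal; negative literals carry a leading '-'."""
    return lit[1:] if lit.startswith("-") else "-" + lit

def defeasibly_provable(goal):
    """goal is defeasibly provable if some applicable rule for goal
    defeats every applicable rule for its complement."""
    applicable = [r for r in rules if r[2] == goal and r[1] <= facts]
    attackers = [r for r in rules if r[2] == negate(goal) and r[1] <= facts]
    return any(all(a[0] in superior.get(r[0], set()) for a in attackers)
               for r in applicable)

print(defeasibly_provable("flies"))   # False: r2 (penguin) defeats r1
print(defeasibly_provable("-flies"))  # True
```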
Abstract:
Why some powers manage to coordinate their security efforts while others confront each other as rivals is still one of the most relevant and debated questions in the field of IR theory. This dissertation aims to contribute to this important debate. In particular, the main goal of the research is to analyse the dynamics of great-power interactions after the end of hegemonic conflicts, that is, to understand why, following the defeat of the common enemies, some of the winning allies continue to cooperate while others begin to engage in political and military competition. In order to understand this difference, the study compares the explanatory value of two rival theoretical perspectives: neorealism, in its main version of the balance-of-power framework, and a liberal approach focused on domestic politics. The thesis is divided into two sections. In the first, I summarize the main assumptions and predictions of the theories, from which I derive two different sets of hypotheses on the evolution of post-war great-power relations. In the second part, I test the hypotheses by focusing on two cases of post-war alignment dynamics: 1) the relations among Austria, Prussia, Russia, Great Britain and France after the Napoleonic wars; 2) the relations among the US, the UK, France and Italy after the end of WWI. The historical cases disconfirm the logic of the balance of power and confirm the liberal hypotheses: the results of the analysis show that changes in the domestic structures of the great powers had a much larger impact on the emergence of new alliances and rivalries than did the international distribution of power. In the conclusion of the dissertation, I provide the reader with a discussion of the main theoretical implications of the empirical findings.
Abstract:
Tectonic and aesthetic implications of integrated monocoque logics and stress-line analysis in architecture.