10 results for Impossibility
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to “understand” and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the Computer Science Department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving web accessibility to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit such knowledge to experiment with new solutions that, for backward-compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with those produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the Four Colour Theorem; • our collaboration with the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita leading to a constructive proof of Lebesgue’s Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using “black box” automation in large formalisations; the impossibility for a user (especially a newcomer) to master the contents of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping such as CIC.
In the second part of the manuscript many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user driven.
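As a bare-bones illustration of the coercion idea mentioned above, the following plain-Python sketch (not Matita/CIC code, and far simpler than Matita's coercive-subtyping machinery; all names are hypothetical) treats mathematical structures as records and uses an explicit projection function in place of the subtyping that CIC lacks: a definition stated for the poorer structure is reused on the richer one through the coercion.

```python
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

T = TypeVar("T")

# Structures as records; an explicit projection plays the role of a coercion.
@dataclass
class Semigroup(Generic[T]):
    op: Callable[[T, T], T]        # an associative binary operation

@dataclass
class Monoid(Generic[T]):
    sg: Semigroup[T]               # the inherited substructure
    unit: T

def to_semigroup(m: Monoid[T]) -> Semigroup[T]:
    """The 'coercion': wherever a Semigroup is expected, a Monoid is accepted
    by projecting out its semigroup part."""
    return m.sg

def square(s: Semigroup[T], x: T) -> T:
    # A definition stated only for semigroups...
    return s.op(x, x)

nat_add = Monoid(Semigroup(lambda a, b: a + b), unit=0)
print(square(to_semigroup(nat_add), 21))   # ...reused on a monoid: prints 42
```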
Abstract:
This research argues for an analysis of textual and cultural forms in the American horror film (1968-1998), by defining its so-called postmodern characteristics. The term “postmodern” does not denote a period in the history of cinema, but a series of forms and strategies recognizable in many American films. From a bipolar, re-mediation and cognitive point of view, the postmodern phenomenon is considered a formal and epistemological re-configuration of the cultural “modern” system. The first section of the work examines theoretical problems about the “postmodern phenomenon” by defining its cultural and formal constants in different areas (epistemology, economy, mass media): convergence, fragmentation, manipulation and immersion represent the former, while “excess” is the morphology of the change, realizing the “fluctuation” of the previously consolidated system. The second section classifies the textual and cultural forms of American postmodern film, generally non-horror. The “classic narrative” structure – a coherent and consequent chain of causal cues toward a conclusion – is scattered by the postmodern constant of “fragmentation”. New textual models arise, fragmenting the narrative ones into aggregations of data without causal-temporal logic. Considering the processes of “transcoding” and “remediation” between media, and the principle of “convergence” in the phenomenon, the essay aims to define these structures in postmodern film as “database forms” and “navigable space forms”. The third section applies this classification to the American horror film (1968-1998). The formal constant of “excess” in the horror genre works on the paradigm of “vision”: if postmodern film shows a crisis of “truth” in vision, in horror movies the excess of vision becomes “hyper-vision” – that is, the “multiplication” of death/blood/torture visions – and “intra-vision”, which shows the impossibility of distinguishing the “real” vision from the virtual/imaginary one. In this perspective, the textual and cultural forms and strategies of postmodern horror film are predominantly: the “database-accumulation” forms, where the events result from a very simple “remote cause” serving as a pretext (as in Night of the Living Dead); and the “database-catalogue” forms, where the events follow one another around a “central” character or theme. In the latter, the catalogue syntagms are either connected by “consecutive” elements, building stories linked by the actions of a single character (usually the killer), or connected as non-consecutive episodes around a general theme: examples of the first kind are built on the model of The Wizard of Gore; the second, on films such as Mario Bava’s I tre volti della paura. The “navigable space” forms are defined as: hyperlink a, where one universe fluctuates between reality and dream, as in Rosemary’s Baby; hyperlink b, where two non-hierarchical universes converge, the first one real and the other fictional, as in the Nightmare series; hyperlink c, where several worlds are separate but become contiguous in the last sequence, as in Targets; the last form, navigable-loop, includes a textual line which suddenly stops and starts again, reflecting the pattern of a “loop” (as in Lost Highway). The essay analyses in detail the organization of “visual space” in the postmodern horror film by tracing representative patterns. It concludes by examining the “convergence” of technologies and cognitive structures of cinema and new media.
Abstract:
The labyrinthum Capella quoted in the title (from an epistle of Prudentius of Troyes) represents the allegory of the studium of the liberal arts and of the search for knowledge in the early Middle Ages. This is a capital problem in early Christianity and, in general, for the whole western world, concerning the relationship between faith and science. I studied the evolution of this subject from its birth to the Carolingian age, focusing on the figures most relevant for western Europe, such as Saint Augustine (De doctrina christiana), Martianus Capella (De Nuptiis Philologiae et Mercurii) and Iohannes Scotus Eriugena (Annotationes in Marcianum). It clearly emerges that there were two opposing positions on this relationship. According to the first, the human being is capable of gaining knowledge of God thanks to his own reason and logical thought processes (through the analysis of nature as a Speculum Dei); according to the other, only faith and grace can give man the possibility to perceive God, and the Bible is the only book men need to know. From late antiquity to Iohannes Scotus's times, a few Christian and pagan authors fall into line with the first position (the Neoplatonic one): Saint Augustine (in the first part of his life; he later retracted some of his views), Martianus, Calcidius and Macrobius. Other philosophers were not Neoplatonic but believed in the power of the studium: Boethius, Cassiodorus, Isidorus of Seville, Hrabanus Maurus and Lupus of Ferrières. In order to get an idea of this conception, I finally focused the research on Iohannes Scotus Eriugena's Annotationes in Marcianum. I commented on Eriugena's work phrase by phrase, trying to catch the sense of his words, his references and philosophical influences, and to trace its antecedents and its influence on the later Middle Ages and the School of Chartres. In this scholastic text Eriugena comments on Capella's work and again poses the question of the studium to his students. Iohannes was a magister in the schola Palatina in the time of Charles the Bald; he knew Saint Augustine's works, as well as those of Boethius, Calcidius, Macrobius, Isidorus and Cassiodorus. He translated Pseudo-Dionysius the Areopagite and Maximus the Confessor. He had a Neoplatonic view of Christianity and tried to reconcile the impossibility of knowing God with man's intellectual capability of getting a glimpse of God through the study of nature. From this point of view, Eriugena's commentary on Martianus Capella was no longer a secondary work. It acquires more and more importance for understanding his research and his mysticism, and for understanding and really grasping the inner sense of his chief work, the Periphyseon.
Abstract:
The application of Concurrency Theory to Systems Biology is in its earliest stage of progress. The metaphor of cells as computing systems by Regev and Shapiro opened the way to the employment of concurrent languages for the modelling of biological systems. Their peculiar characteristics led to the design of many bio-inspired formalisms which achieve higher faithfulness and specificity. In this thesis we present pi@, an extremely simple and conservative extension of the pi-calculus which represents a keystone in this respect, thanks to its expressive capabilities. The pi@ calculus is obtained by adding polyadic synchronisation and priority to the pi-calculus, in order to achieve compartment semantics and atomicity of complex operations, respectively. In its direct application to biological modelling, the stochastic variant of the calculus, Spi@, is shown to be able to model consistently several phenomena, such as the formation of molecular complexes, the hierarchical subdivision of the system into compartments, inter-compartment reactions, and the dynamic reorganisation of the compartment structure consistent with volume variation. The pivotal role of pi@ is evidenced by its capability of encoding in a compositional way several bio-inspired formalisms, so that it represents the optimal core of a framework for the analysis and implementation of bio-inspired languages. In this respect, encodings of BioAmbients, Brane Calculi and a variant of P Systems into pi@ are formalised. The conciseness of their translations into pi@ allows their indirect comparison by means of their encodings. Furthermore it provides a ready-to-run implementation of minimal effort whose correctness is guaranteed by the correctness of the respective encoding functions. Further important results of general validity are stated on the expressive power of priority. Several impossibility results are described, which clearly state the superior expressiveness of prioritised languages and the problems arising in the attempt to provide a parallel implementation of them. To this aim, a new setting in distributed computing (the last man standing problem) is singled out and exploited to prove the impossibility of providing a purely parallel implementation of priority by means of point-to-point or broadcast communication.
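To make the role of priority concrete, here is a minimal Python sketch (not the actual Spi@ machinery; the reactions, rates and state below are hypothetical) of a Gillespie-style stochastic step in which only the enabled reactions of the highest priority compete for execution, so a prioritised “atomic” operation such as complex formation pre-empts ordinary low-priority reactions.

```python
import random

def step(state, reactions):
    """One prioritised stochastic step: among enabled reactions, only those of
    the highest priority compete, chosen proportionally to their rates."""
    enabled = [r for r in reactions if r["enabled"](state)]
    if not enabled:
        return None                       # nothing can fire
    top = max(r["priority"] for r in enabled)
    candidates = [r for r in enabled if r["priority"] == top]
    total = sum(r["rate"] for r in candidates)
    pick, acc = random.uniform(0, total), 0.0
    for r in candidates:
        acc += r["rate"]
        if pick <= acc:
            return r["effect"](state)
    return candidates[-1]["effect"](state)

# Toy example: high-priority complex formation pre-empts degradation of A
# whenever both A and B are available, despite its lower rate.
reactions = [
    {"priority": 2, "rate": 1.0,
     "enabled": lambda s: s["A"] > 0 and s["B"] > 0,
     "effect": lambda s: {**s, "A": s["A"] - 1, "B": s["B"] - 1, "AB": s["AB"] + 1}},
    {"priority": 1, "rate": 5.0,
     "enabled": lambda s: s["A"] > 0,
     "effect": lambda s: {**s, "A": s["A"] - 1}},
]

state = {"A": 3, "B": 2, "AB": 0}
while (state := step(state, reactions)) is not None:
    print(state)
```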
Abstract:
A very recent and exciting new area of research is the application of Concurrency Theory tools to formalize and analyze biological systems, and one of the most promising approaches comes from process algebras (process calculi). A process calculus is a formal language for describing concurrent systems which comes with well-established techniques for quantitative and qualitative analysis. Biological systems can be regarded as concurrent systems and therefore modelled by means of process calculi. In this thesis we focus on the process calculi approach to the modelling of biological systems and investigate, mostly from a theoretical point of view, several promising bio-inspired formalisms: Brane Calculi and the k-calculus family. We provide several expressiveness results, mostly by means of comparisons between calculi. We provide a lower bound on the computational power of the non-Turing-complete MDB Brane Calculi by showing an encoding of a simple P System into MDB. We address the issue of local implementation within the k-calculus family: whether n-way rewrites can be simulated by binary interactions only. A solution introducing divergence is provided, and we prove that a deterministic solution preserving the termination property is not possible. We use the symmetric leader election problem to test synchronization capabilities within the k-calculus family. Several fragments of the original k-calculus are considered, and we prove an impossibility result about encoding n-way synchronization into (n-1)-way synchronization. A similar impossibility result is obtained in a pure computer science context: we introduce CCSn, an extension of CCS with multiple input prefixes, and show, using the dining philosophers problem, that there is no reasonable encoding of CCS(n+1) into CCSn.
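The dining philosophers argument can be given an elementary flavour with a small Python sketch (purely illustrative; this is neither the thesis's encoding nor its proof): with a 3-way atomic synchronization a philosopher grabs both forks in one step and the toy system has no reachable deadlock, whereas the naive binary protocol, in which the two forks are taken in separate steps and a held fork is never put back, does reach a deadlocked state.

```python
N = 3  # philosophers/forks in the toy instance

def successors(state, atomic):
    # state[i]: 0 = thinking, 1 = holds left fork only, 2 = eating (both forks)
    held = set()
    for i, s in enumerate(state):
        if s >= 1: held.add(i)                 # left fork of philosopher i
        if s == 2: held.add((i + 1) % N)       # right fork of philosopher i
    out = []
    for i, s in enumerate(state):
        left, right = i, (i + 1) % N
        if atomic:
            if s == 0 and left not in held and right not in held:
                out.append(state[:i] + (2,) + state[i + 1:])  # take both at once
        else:
            if s == 0 and left not in held:
                out.append(state[:i] + (1,) + state[i + 1:])  # take left only
            if s == 1 and right not in held:
                out.append(state[:i] + (2,) + state[i + 1:])  # then take right
        if s == 2:
            out.append(state[:i] + (0,) + state[i + 1:])      # release both
    return out

def has_reachable_deadlock(atomic):
    seen, todo = set(), [(0,) * N]
    while todo:
        st = todo.pop()
        if st in seen:
            continue
        seen.add(st)
        succ = successors(st, atomic)
        if not succ:                 # no enabled transition: deadlock
            return True
        todo.extend(succ)
    return False

print("binary steps :", has_reachable_deadlock(atomic=False))  # True
print("3-way atomic :", has_reachable_deadlock(atomic=True))   # False
```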
Abstract:
In this thesis we have developed solutions to common issues regarding widefield microscopes, facing the problem of the intensity inhomogeneity of an image and dealing with two strong limitations: the impossibility of acquiring either highly detailed images representative of whole samples or deep 3D objects. First, we cope with the problem of the non-uniform distribution of the light signal inside a single image, known as vignetting. In particular we proposed, for both light and fluorescence microscopy, non-parametric multi-image-based methods, in which the vignetting function is estimated directly from the sample without requiring any prior information. After obtaining flat-field-corrected images, we studied how to overcome the limitation of the camera's field of view, so as to be able to acquire large areas at high magnification. To this purpose, we developed mosaicing techniques capable of working on-line. Starting from a set of overlapping images acquired manually, we validated a fast registration approach to accurately stitch the images together. Finally, we worked to virtually extend the field of view of the camera in the third dimension, with the purpose of reconstructing a single image completely in focus, starting from objects having significant depth or lying in different focal planes. After studying the existing approaches for extending the depth of focus of the microscope, we proposed a general method that does not require any prior information. In order to compare the outcome of existing methods, different standard metrics are commonly used in the literature; however, no metric is available to compare different methods in real cases. First, we validated a metric able to rank the methods as the Universal Quality Index does, but without needing any reference ground truth. Second, we proved that the approach we developed performs better in both synthetic and real cases.
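For the flat-field correction step, the following is a minimal Python/NumPy sketch of the general idea (a simplified stand-in, not the multi-image method proposed in the thesis): assuming a purely multiplicative vignetting field and a sample whose content varies enough across the stack, a per-pixel robust statistic over many images mainly reflects the illumination/optics profile, which can then be divided out. The synthetic data and all names are hypothetical.

```python
import numpy as np

def estimate_vignetting(stack):
    """Estimate a multiplicative vignetting field from a stack of images
    (shape: n_images x H x W) of a varying/moving sample.  Simplifying
    assumption: the sample content averages out over the stack, so the
    per-pixel median mainly reflects the illumination profile."""
    profile = np.median(stack.astype(np.float64), axis=0)
    profile = np.clip(profile, 1e-6, None)   # guard against division by zero
    return profile / profile.mean()          # normalise to unit mean gain

def flat_field_correct(image, vignetting):
    """Divide out the estimated multiplicative field."""
    return image.astype(np.float64) / vignetting

# Hypothetical synthetic check: a radially darkening field applied to random
# 'sample' frames, then estimated back from the stack and divided out.
H = W = 128
yy, xx = np.mgrid[0:H, 0:W]
r2 = (yy - H / 2) ** 2 + (xx - W / 2) ** 2
true_field = 1.0 - 0.5 * r2 / r2.max()       # 1.0 at the centre, 0.5 at corners
stack = np.stack([np.random.rand(H, W) * true_field for _ in range(200)])
corrected = flat_field_correct(stack[0], estimate_vignetting(stack))
```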
Abstract:
MultiProcessor Systems-on-Chip (MPSoCs) are the core of today's and next-generation computing platforms. Their relevance in the global market continuously increases, occupying an important role both in everyday-life products (e.g. smartphones, tablets, laptops, cars) and in strategic market sectors such as aviation, defense, robotics, and medicine. Despite the incredible performance improvements of recent years, processor manufacturers have had to deal with issues, commonly called “walls”, that have hindered processor development. After the famous “Power Wall”, which limited the maximum frequency of a single core and marked the birth of the modern multiprocessor system-on-chip, the “Thermal Wall” and the “Utilization Wall” are the current key limiters of performance improvements. The former concerns the damaging effects of high temperature on the chip caused by large power-density dissipation, whereas the latter refers to the impossibility of fully exploiting the computing power of the processor due to limitations on power and temperature budgets. In this thesis we faced these challenges by developing efficient and reliable solutions able to maximize performance while limiting the maximum temperature below a fixed critical threshold and saving energy. This has been possible by exploiting the Model Predictive Control (MPC) paradigm, which solves an optimization problem subject to constraints in order to find the optimal control decisions for the future interval. A fully distributed MPC-based thermal controller with far lower complexity than a centralized one has been developed. The feasibility of the control and interesting properties allowing the simplification of the control design have been proved by studying a partial differential equation thermal model. Finally, the controller has been efficiently included in more complex control schemes able to minimize energy consumption and deal with mixed-criticality tasks.
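To illustrate the MPC idea in the simplest possible setting, here is a Python sketch of a receding-horizon controller for a single, hypothetical first-order thermal model (the thesis works with a PDE model and a fully distributed controller; the model, constants and code below are illustrative assumptions, not taken from it): at each step it maximizes the delivered power over a finite horizon, subject to a temperature cap and power bounds, by solving a linear program, and applies only the first decision before re-solving.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical first-order thermal model for a single core (all constants are
# illustrative, not taken from the thesis):
#   T[k+1] = a*T[k] + b*P[k] + d
a, b, d = 0.95, 0.8, 2.0
T_CRIT, P_MAX, H = 80.0, 10.0, 10      # temperature cap, power bound, horizon

def mpc_step(T0):
    """One receding-horizon step: maximize total delivered power over the
    horizon (a crude proxy for performance) subject to T[k] <= T_CRIT."""
    # Unrolled model: T[i] = a**i*T0 + sum_{j<i} a**(i-1-j) * (b*P[j] + d),
    # so every temperature constraint is linear in the decision vector P.
    A_ub = np.zeros((H, H))
    b_ub = np.zeros(H)
    for i in range(1, H + 1):
        for j in range(i):
            A_ub[i - 1, j] = a ** (i - 1 - j) * b
        drift = sum(a ** (i - 1 - j) * d for j in range(i))
        b_ub[i - 1] = T_CRIT - a ** i * T0 - drift
    res = linprog(c=-np.ones(H), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0.0, P_MAX)] * H, method="highs")
    return res.x[0]                     # apply only the first decision

# Toy closed loop: apply P[0], let the (same) model evolve, re-solve.
T = 60.0
for _ in range(5):
    P = mpc_step(T)
    T = a * T + b * P + d
    print(f"P = {P:.2f} W,  T = {T:.2f} degC")
```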
Abstract:
INTRODUCTION Echocardiography is the standard clinical approach for quantification of the severity of aortic stenosis (AS). A comprehensive examination of its overall reproducibility, with simultaneous estimation of its variance components across multiple operators, readers, probe applications, and beats, has not been undertaken. METHODS AND RESULTS Twenty-seven subjects with AS were scanned over 7 months in the echo department by a median of 3 different operators. From each patient and each operator, multiple runs of beats from multiple probe positions were stored for later analysis by multiple readers. The coefficient of variation was 13.3%, 15.9%, 17.6%, and 20.2% for the aortic peak velocity (Vmax), the aortic velocity time integral (VTI), and the left ventricular outflow tract (LVOT) Vmax and VTI, respectively. The largest individual contributors to the overall variability were the beat-to-beat variability (9.0%, 9.3%, 9.5%, and 9.4%, respectively) and the inability of an individual operator to apply the probe to precisely the same position twice (8.3%, 9.4%, 12.9%, and 10.7%, respectively). The tracing (intra-reader), reader (inter-reader), and operator (inter-operator) contributions were less important. CONCLUSIONS Reproducibility of measurements in AS is poorer than often reported in the literature. The source of this variability does not appear, as traditionally believed, to result from a lack of training or from operator- and reader-specific factors. Rather, the unavoidable beat-to-beat biological variability and the inherent impossibility of applying the ultrasound probe in exactly the same position each time are the largest contributors. Consequently, guidelines suggesting greater standardisation of procedures and further training for sonographers are unlikely to result in an improvement in precision. Clinicians themselves should be wary of relying even on three-beat averages, as their expected coefficient of variation is 10.3% for the peak velocity at the aortic valve.
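The final caution rests on a variance-components argument: averaging n beats from a single probe application shrinks only the beat-to-beat component of the variance, leaving probe placement and the remaining components untouched. A small Python sketch of that arithmetic is given below, using made-up component values for illustration rather than the estimates reported in the thesis.

```python
import math

# Hypothetical, illustrative CV components (percent) for a single measurement;
# these are NOT the component estimates from the thesis.
CV_BEAT  = 9.0    # beat-to-beat biological variability
CV_PROBE = 8.0    # probe re-application by the same operator
CV_OTHER = 5.0    # operator / reader / tracing contributions combined

def cv_of_n_beat_average(n):
    """Assuming independent, additive variance components, averaging n beats
    from one probe application divides only the beat component's variance
    by n; the other components are unchanged."""
    return math.sqrt(CV_BEAT ** 2 / n + CV_PROBE ** 2 + CV_OTHER ** 2)

for n in (1, 3, 10):
    print(f"{n:2d}-beat average: CV ~ {cv_of_n_beat_average(n):.1f}%")
```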
Abstract:
Within the transdisciplinary conceptual and methodological approach of Sustainability Science, this thesis develops a theoretical background for conceptualizing a definition of sustainability, on the basis of which to propose a development model alternative to the dominant one, articulated in concrete proposals within the case study of European regulation on energy saving. Through a transdisciplinary analysis, the research identifies a structural crisis of the dominant development model, based on economic growth as the (sole) indicator of well-being, as well as a crisis of values. Attention is therefore focused on identifying a paradigm suited to responding to the critical issues that emerged from the analysis. To this end, the concepts of sustainable development and sustainability are examined, leading to the proposal of a new paradigm (“ecosystemic sustainability”) that accounts for the impossibility of infinite growth in a system characterized by limited resources. Proposals for a sustainable development model alternative to the dominant one are then presented. This theoretical elaboration is translated into concrete terms through the development of a case study. To this end, the function of regulation as a tool for guaranteeing the practical application of the theoretical model is first analysed. Attention is focused on the case study represented by European Union policy and regulation on energy saving and energy efficiency. The analysis reveals a progressive conflation of the two concepts of energy saving and energy efficiency, for which explanations are advanced and risks identified in terms of rebound effects. To address the inconsistencies between the European Union's proclaimed objective of reducing energy consumption and the policy actually pursued, a form of “regulation for sustainability” in the residential housing sector is proposed which, by promoting the sharing of energy services, recovers the proper meaning of energy saving as a reduction of consumption through behavioural change, enriching it with a new connotation as a “relational good” for the promotion of relational and individual well-being.
Abstract:
The subject of this thesis is the analysis of the modes of representation of trauma in the twentieth-century novel and, in particular, in the works of Samuel Beckett, Georges Perec and Agota Kristof. The study is grounded in an examination of the linguistic and narrative procedures for representing trauma in the prose of the authors mentioned, in order to trace the outlines of an aesthetics capable of describing the peculiar characteristics of narratives in which the anti-narrative dimension of traumatic memory takes on the role of guiding aesthetic principle. The analysis dwells on the crucial relationship existing, in all three authors, between the representation of trauma and the development of narrative strategies that can be defined as “denegative”. The analysis of the literary texts is conducted on the basis of the critical corpus of Trauma Studies, of Ricœur's hermeneutics of narrative and of psychoanalytic theories of language, and is accompanied, where possible, by a philological-genetic study of the authors' materials. In light of these premises, I intend to re-evaluate the representative and testimonial character of the literature of the last century, in contrast with the habit of seeing in the twentieth-century novel the triumph of anti-mimesis and the decline of storytelling. Since traumatic narratives are built around and through gaps in language, the thesis is that it is precisely these linguistic and narrative gaps (amnesias, achronies, aphasias, slips, omissions and even more sophisticated absences, as in the case of Perec) that represent, mimetically, the apparently inaccessible reality of trauma. I attempt to show that these new canons of representation do not denounce the impossibility of narration, but rather constitute a challenge to silence, concealed in subtler and more complex narrative conventions, which maintain an indirect relationship of filiation (along a path that we might call denegative) with those of the traditional novel.