351 results for Proofs
Abstract:
The thesis deals with the modularity conjecture for three-dimensional Calabi-Yau varieties. This is a generalization of the work of A. Wiles and others on the modularity of elliptic curves. Modularity connects the number of points on varieties with the coefficients of certain modular forms. In chapter 1 we collect the basics on the arithmetic of Calabi-Yau manifolds, including general modularity results and strategies for modularity proofs. In chapters 2, 3, 4 and 5 we investigate examples of modular Calabi-Yau threefolds, including all examples occurring in the literature and many new ones. Double octics, i.e. double coverings of projective 3-space branched along an octic surface, are studied in detail. In chapter 6 we deal with examples connected with the same modular forms. According to the Tate conjecture there should be correspondences between them. Many correspondences are constructed explicitly. We finish by formulating conjectures on the occurring newforms, especially their levels. In the appendices we compile tables of coefficients of weight 2 and weight 4 newforms and many examples of double octics.
Abstract:
Interactive theorem provers are tools designed for the certification of formal proofs developed through man-machine collaboration. Formal proofs obtained in this way cover a large variety of logical theories, ranging from the branches of mainstream mathematics to the field of software verification. The border between these two worlds is marked by results in theoretical computer science and by proofs related to the metatheory of programming languages. This last field, an obvious application of interactive theorem proving, nonetheless poses a serious challenge to the users of such tools, due both to the particularly structured way in which these proofs are constructed, and to difficulties in managing notions typical of programming languages, such as variable binding. This thesis is composed of two parts, discussing our experience in the development of the Matita interactive theorem prover and its use in the mechanization of the metatheory of programming languages. More specifically, part I covers: - the results of our effort to provide a better framework for the development of tactics for Matita, making their implementation and debugging easier and resulting in much clearer code; - a discussion of the implementation of two tactics providing infrastructure for the unification of constructor forms and the inversion of inductive predicates; we point out interactions between induction and inversion and provide an advancement over the state of the art. In the second part of the thesis, we focus on aspects related to the formalization of programming languages.
We describe two of our works: - a discussion of the basic issues we encountered in our formalizations of part 1A of the POPLmark challenge, where we apply the extended inversion principles we implemented for Matita; - a formalization of an algebraic logical framework, posing more complex challenges, including multiple binding and a form of hereditary substitution; for the encoding of binding, this work adopts an extension of Masahiko Sato's canonical locally named representation, which we designed during our visit to the Laboratory for Foundations of Computer Science at the University of Edinburgh, under the supervision of Randy Pollack.
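The inversion principles mentioned in this abstract can be illustrated with a small generic example. The following sketch is written in Lean rather than Matita syntax (a hypothetical stand-in, since Matita's tactic language differs): inverting a derivation of an inductive predicate recovers the only premises under which it could have been derived, with the impossible constructor cases discharged by unification.

```lean
-- Illustrative sketch (Lean, not Matita syntax): inversion on an
-- inductive predicate recovers the constraints under which it was derived.
inductive Even : Nat → Prop where
  | zero : Even 0
  | succ : ∀ n, Even n → Even (n + 2)

-- A proof of `Even (n + 2)` can only come from the `succ` constructor;
-- inversion (here via `cases`) extracts its premise `Even n`.
theorem even_inv (n : Nat) (h : Even (n + 2)) : Even n := by
  cases h with
  | succ _ h' => exact h'
```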
Abstract:
The Spin-Statistics theorem states that the statistics of a system of identical particles is determined by their spin: particles of integer spin are Bosons (i.e. obey Bose-Einstein statistics), whereas particles of half-integer spin are Fermions (i.e. obey Fermi-Dirac statistics). Since the original proof by Fierz and Pauli, it has been known that the connection between Spin and Statistics follows from the general principles of relativistic Quantum Field Theory. In spite of this, there are different approaches to Spin-Statistics, and it is not clear whether the theorem holds under assumptions that are different from, and even less restrictive than, the usual ones (e.g. Lorentz covariance). Additionally, in Quantum Mechanics there is a deep relation between indistinguishability and the geometry of the configuration space. This is clearly illustrated by Gibbs' paradox. Therefore, for many years efforts have been made to find a geometric proof of the connection between Spin and Statistics. Recently, various proposals have been put forward that attempt to derive the Spin-Statistics connection from assumptions different from the ones used in the relativistic, quantum field theoretic proofs. Among these, the one due to Berry and Robbins (BR), based on the postulation of a certain single-valuedness condition, has caused a renewed interest in the problem. In the present thesis, we consider the problem of indistinguishability in Quantum Mechanics from a geometric-algebraic point of view. An approach is developed to study configuration spaces Q having a finite fundamental group, which allows us to describe different geometric structures of Q in terms of spaces of functions on the universal cover of Q.
In particular, it is shown that the space of complex continuous functions over the universal cover of Q admits a decomposition into C(Q)-submodules, labelled by the irreducible representations of the fundamental group of Q, which can be interpreted as the spaces of sections of certain flat vector bundles over Q. With this technique, various results pertaining to the problem of quantum indistinguishability are reproduced in a clear and systematic way. Our method is also used to give a global formulation of the BR construction. As a result of this analysis, it is found that the single-valuedness condition of BR is inconsistent. Additionally, within our approach, a proposal is made aiming at establishing the Fermi-Bose alternative.
Abstract:
The enzymes of carotenoid metabolism cleave provitamin A carotenoids into important retinoids (e.g. vitamin A, retinoic acid), which organisms require during development and in visual systems. The present work presents for the first time a carotenoid oxygenase (BCO) from sponges (S. domuncula) that is unique in the animal kingdom and finds only one orthologous counterpart in plants (Crocus sativus). The enzyme is a 7,8(7',8')-carotenoid oxygenase that cleaves C40 carotenoids into a C10 apocarotenoid and 8'-apocarotenal. Using HPLC, the primary cleavage products of β-carotene, lycopene and zeaxanthin were detected, as well as the inner chain fragment (crocetin), identical for all of them, produced upon double cleavage. Detection of the BCO transcripts (including in situ) demonstrates an involvement of the enzyme in developmental processes and reveals a regulation of the enzyme that is both strictly spatio-temporal and controlled via feedback processes. A further gene identified here resembles a bacterial apocarotenoid oxygenase (ACO), which cleaves the 8'-apocarotenal produced by BCO once more, thereby generating retinal. The latter serves as the chromophore of numerous visual systems and can, via enzymes of retinoid metabolism, either be stored or converted into the important morphogen retinoic acid. Two potential enzymes are presented here that could be involved in this retinal/retinol (storage) interconversion, as well as one that possibly converts retinal to retinoic acid. The results presented here support the hypothesis that retinoic acid is not an autapomorphic morphogen of the chordates.
Abstract:
In recent years, the advancement of research in cryptography has led to the need for software tools for the verification of formal proofs. EasyCrypt is a tool, currently under development, designed for game-based proofs. This thesis presents a case study on the applicability of EasyCrypt to mixing networks (mixnets). An exhaustive survey of the mixnets proposed in the literature is included, and the theory underlying game-based cryptographic proofs is described. EasyCrypt is analyzed both in its theoretical foundations and in its practical applications, with particular reference to the experiments carried out to prove security properties of mixnets.
Abstract:
The Curry-Howard isomorphism is the idea that proofs in natural deduction can be put in correspondence with lambda terms in such a way that this correspondence is preserved by normalization. The concept can be extended from Intuitionistic Logic to other systems, such as Linear Logic. One of the nice consequences of this isomorphism is that we can reason about functional programs with formal tools typical of proof systems: such analysis can also cover quantitative properties of programs, such as the number of steps a program takes to terminate. Another is the possibility of describing the execution of these programs in terms of abstract machines. In 1990 Griffin proved that the correspondence can be extended to Classical Logic and control operators. That is, Classical Logic adds the possibility of manipulating continuations. In this thesis we see how the ideas described above work in this larger context.
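As a minimal illustration of the correspondence (a generic textbook example, not taken from the thesis), a natural-deduction proof of A ∧ B → B ∧ A is, under Curry-Howard, exactly the lambda term that swaps the components of a pair; rendered here in Lean:

```lean
-- Under Curry-Howard, conjunction corresponds to the product type,
-- so a proof of A ∧ B → B ∧ A is a function that swaps a pair:
example (A B : Prop) : A ∧ B → B ∧ A :=
  fun p => ⟨p.2, p.1⟩
```

Normalizing this proof term corresponds to evaluating the function, which is the sense in which the correspondence is preserved by normalization.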
Abstract:
Circulating fibrocytes (CFs) are bone marrow-derived mesenchymal progenitor cells that express a pattern of surface markers shared with leukocytes, hematopoietic progenitor cells and fibroblasts. CF precursors display the ability to differentiate into fibroblasts and myofibroblasts, as well as adipocytes. Fibrocytes have been shown to contribute to tissue fibrosis in end-stage renal disease (ESRD), as well as in other fibrotic diseases, driving fibrogenic processes in other organs including the lung, heart, gut and liver. This evidence has been confirmed by several experimental findings in mouse models of kidney injury. In the present study, we developed a protocol for the study of CFs, using peripheral blood mononuclear cell (PBMC) samples collected from healthy human volunteers. Using flow cytometry, in vitro culture assays and gene expression assays, we were able to study and characterize this CF population. Moreover, the results confirmed that these approaches are reliable and reproducible for the investigation of the circulating fibrocyte population in whole blood samples. Our final aim is to confirm the presence of a correlation between renal fibrosis progression and circulating fibrocyte levels in chronic kidney disease (CKD) patients. Following a study protocol submitted to and approved by the Ethics Committee, we are continuing the study of CF induction in a cohort of sixty patients affected by CKD, divided into three distinct groups by glomerular filtration rate (GFR) level, plus a control group of thirty healthy subjects. Ongoing experiments will determine whether circulating fibrocytes represent novel biomarkers for the study of CKD progression in the early and late phases of this disease.
Abstract:
The use of linear programming in various areas has increased with the significant improvement of specialized solvers. Linear programs are used as such to model practical problems, or as subroutines in algorithms such as formal proofs or branch-and-cut frameworks. In many situations a certified answer is needed, for example the guarantee that the linear program is feasible or infeasible, or a provably safe bound on its objective value. Most of the available solvers work with floating-point arithmetic and are thus subject to its shortcomings, such as rounding errors or underflow, and can therefore deliver incorrect answers. While adequate for some applications, this is unacceptable for critical applications like flight control or nuclear plant management, due to the potentially catastrophic consequences. We propose a method that gives a certified answer whether a linear program is feasible or infeasible, or returns 'unknown'. The advantage of our method is that it is reasonably fast and rarely answers 'unknown'. It works by computing a safe solution that is in some way the best possible in the relative interior of the feasible set. To certify the relative interior, we employ exact arithmetic, whose use is nevertheless in general limited to critical places, allowing us to remain computationally efficient. Moreover, when certain conditions are fulfilled, our method is able to deliver a provable bound on the objective value of the linear program. We test our algorithm on typical benchmark sets and obtain higher rates of success compared to previous approaches for this problem, while keeping the running times acceptably small. The computed objective value bounds are in most cases very close to the known exact objective values. We demonstrate the usability of the method by additionally employing a variant of it in a different scenario, namely to improve the results of a Satisfiability Modulo Theories solver.
Our method is used as a black box in the nodes of a branch-and-bound tree to implement conflict learning based on the certificate of infeasibility for linear programs consisting of subsets of linear constraints. The generated conflict clauses are in general small and give good prospects for reducing the search space. Compared to other methods we obtain significant improvements in the running time, especially on the large instances.
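The role that exact arithmetic plays in such certificates can be sketched with a toy example (an illustrative Farkas-certificate check, not the thesis's algorithm): a vector y ≥ 0 with yᵀA = 0 and yᵀb < 0 certifies that the system Ax ≤ b is infeasible, and rational arithmetic verifies this without any rounding error.

```python
from fractions import Fraction

def check_farkas_certificate(A, b, y):
    """Exactly verify a Farkas infeasibility certificate for A x <= b:
    if y >= 0, y^T A = 0 and y^T b < 0, the system is infeasible."""
    A = [[Fraction(a) for a in row] for row in A]
    b = [Fraction(v) for v in b]
    y = [Fraction(v) for v in y]
    if any(v < 0 for v in y):          # certificate must be nonnegative
        return False
    cols = len(A[0])
    yTA = [sum(y[i] * A[i][j] for i in range(len(A))) for j in range(cols)]
    yTb = sum(y[i] * b[i] for i in range(len(b)))
    return all(c == 0 for c in yTA) and yTb < 0

# x <= 1 together with -x <= -2 (i.e. x >= 2) is infeasible;
# y = (1, 1) certifies it, since y^T A = 0 and y^T b = -1 < 0.
print(check_farkas_certificate([[1], [-1]], [1, -2], [1, 1]))  # True
```

Because `Fraction` arithmetic is exact, the verification step cannot be fooled by rounding, which is the point of restricting exact arithmetic to such critical places.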
Abstract:
A new control scheme is presented in this thesis. Based on the NonLinear Geometric Approach, the proposed Active Control System represents a new way to view reconfigurable controllers for aerospace applications. The presence of the Diagnosis module (providing the estimation of generic signals which, depending on the case, can be faults, disturbances or system parameters), the main feature of the described Active Control System, is a characteristic shared by three well-known control systems: Active Fault Tolerant Control, Indirect Adaptive Control and Active Disturbance Rejection Control. The standard NonLinear Geometric Approach (NLGA) has been thoroughly investigated and then improved to extend its applicability to more complex models. The standard NLGA procedure has been modified to take into account feasible and estimable sets of unknown signals. Furthermore, the application of the Singular Perturbations approximation has led to the solution of Detection and Isolation problems in scenarios too complex to be solved by the standard NLGA. The estimation process has also been improved, where multiple redundant measurements are available, by the introduction of a new algorithm, here called "Least Squares - Sliding Mode". It guarantees optimality, in the sense of least squares, and finite estimation time, in the sense of sliding mode. The Active Control System concept has been formalized in two controllers: a nonlinear backstepping controller and a nonlinear composite controller. Particularly interesting is the integration, in the controller design, of the estimates coming from the Diagnosis module. Stability proofs are provided for both control schemes. Finally, several aerospace applications are presented to show the applicability and effectiveness of the proposed NLGA-based Active Control System.
Abstract:
In this thesis I analyze the microwave tomography method for recognizing breast cancer. I study how to identify the dielectric permittivity, the parameter of the Helmholtz equation used to model the real physical problem. Through a nonlinear least squares method I solve a parameter identification problem; I present the theoretical approach and the development needed to reach the results. I use the Levenberg-Marquardt algorithm, applied within the COMSOL software to multiphysics models, and I carry out numerical tests on simplified test problems related to the specific real problem to be solved.
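The Levenberg-Marquardt iteration used in such parameter identification can be sketched on a toy scalar problem (a hypothetical example with a one-parameter exponential model, not the thesis's COMSOL-based setup): the damped Gauss-Newton step is accepted when it decreases the residual cost, and the damping is adapted otherwise.

```python
import math

def levenberg_marquardt_1d(ts, ys, k0, lam=1e-3, iters=50):
    """Minimal scalar Levenberg-Marquardt sketch: identify k in the
    model y = exp(-k*t) by nonlinear least squares."""
    k = k0
    for _ in range(iters):
        # residuals r_i = model - data, and Jacobian entries dr_i/dk
        r = [math.exp(-k * t) - y for t, y in zip(ts, ys)]
        J = [-t * math.exp(-k * t) for t in ts]
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        step = -Jtr / (JtJ + lam)          # damped Gauss-Newton step
        new_k = k + step
        old_cost = sum(ri * ri for ri in r)
        new_cost = sum((math.exp(-new_k * t) - y) ** 2 for t, y in zip(ts, ys))
        if new_cost < old_cost:            # accept step, relax damping
            k, lam = new_k, lam / 2
        else:                              # reject step, increase damping
            lam *= 10
    return k

# Synthetic data generated with k = 1.5, recovered from a rough guess:
ts = [0.1 * i for i in range(1, 20)]
ys = [math.exp(-1.5 * t) for t in ts]
print(round(levenberg_marquardt_1d(ts, ys, k0=0.5), 3))  # ≈ 1.5
```

In the multi-parameter case the scalar division becomes the solution of the damped normal equations (JᵀJ + λI)δ = -Jᵀr, but the accept/reject logic is the same.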
Abstract:
The response of some Argentine workers to the 2001 crisis of neoliberalism gave rise to a movement of worker-recovered enterprises (empresas recuperadas por sus trabajadores or ERTs). The ERTs have emerged as former employees took over the control of generally fraudulently bankrupt factories and enterprises. The analysis of the ERT movement within the neoliberal global capitalist order will draw from William Robinson’s (2004) neo-Gramscian concept of hegemony. The theoretical framework of neo-Gramscian hegemony will be used in exposing the contradictions of capitalism on the global, national, organizational and individual scales and the effects they have on the ERT movement. The ERT movement has demonstrated a strong level of resilience, despite the numerous economic, social, political and cultural challenges and limitations it faces as a consequence of the implementation of neoliberalism globally. ERTs have shown that through non-violent protests, democratic principles of management and social inclusion, it is possible to start constructing an alternative social order that is based on the cooperative principles of “honesty, openness, social responsibility and caring for others” (ICA 2007) as opposed to secrecy, exclusiveness, individualism and self-interestedness. In order to meet this “utopian” vision, it is essential to push the limits of the possible within the current social order and broaden the alliance to include the organized members of the working class, such as the members of trade unions, and the unorganized, such as the unemployed and underemployed. Though marginal in number and size, the members of ERTs have given rise to a model that is worth exploring in other countries and regions burdened by the contradictory workings of capitalism. Today, ERTs serve as living proof that workers too, not capitalists alone, are capable of successfully running businesses.
Abstract:
Justification Logic studies epistemic and provability phenomena by introducing justifications/proofs into the language in the form of justification terms. Pure justification logics serve as counterparts of traditional modal epistemic logics, and hybrid logics combine epistemic modalities with justification terms. The computational complexity of pure justification logics is typically lower than that of the corresponding modal logics. Moreover, the so-called reflected fragments, which still contain complete information about the respective justification logics, are known to be in NP for a wide range of justification logics, pure and hybrid alike. This paper shows that, under reasonable additional restrictions, these reflected fragments are NP-complete, thereby proving a matching lower bound. The proof method is then extended to provide a uniform proof that the corresponding full pure justification logics are Π^p_2-hard, reproving and generalizing an earlier result by Milnikel.
Abstract:
Self-stabilization is a property of a distributed system such that, regardless of the legitimacy of its current state, the system behavior shall eventually reach a legitimate state and shall remain legitimate thereafter. The elegance of self-stabilization stems from the fact that it distinguishes distributed systems by a strong fault tolerance property against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been witnessed by many researchers; most of the existing techniques for the verification and design of self-stabilization are either brute-force, or adopt manual approaches non-amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics equally exploit the computational power of a single workstation and the available parallelism on computer clusters. We obtain existing and new stabilizing solutions for classical protocols like maximal matching, ring coloring, mutual exclusion, leader election and agreement. Second, we consider a foundation for local reasoning about self-stabilization; i.e., study the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions – verifiable in the local state space of every process – for global deadlock- and livelock-freedom of protocols on ring topologies. 
Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, thereby enabling our assertions about deadlocks and livelocks to apply on rings of arbitrary sizes without worrying about state explosion.
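The convergence property described above can be illustrated with the classical example of a self-stabilizing protocol (Dijkstra's K-state token ring, a standard textbook protocol rather than one from the dissertation): started from an arbitrary, illegitimate state, the ring converges to a legitimate state in which exactly one process holds the privilege.

```python
def stabilize(states, K, steps):
    """Dijkstra's K-state self-stabilizing token ring under a serial daemon.
    Process 0 is privileged when its state equals its left neighbor's;
    every other process is privileged when its state differs from its
    left neighbor's. Each step fires the first privileged process."""
    n = len(states)
    for _ in range(steps):
        for i in range(n):
            if i == 0 and states[0] == states[n - 1]:
                states[0] = (states[0] + 1) % K
                break
            if i > 0 and states[i] != states[i - 1]:
                states[i] = states[i - 1]
                break
    return states

def privileged(states, K):
    """Indices of the processes currently holding a privilege (token)."""
    n = len(states)
    return [i for i in range(n)
            if (i == 0 and states[0] == states[n - 1])
            or (i > 0 and states[i] != states[i - 1])]

# From an arbitrary initial state, the ring reaches a legitimate state
# in which exactly one process is privileged (requires K >= n):
s = stabilize([3, 1, 4, 1, 5], K=7, steps=100)
print(len(privileged(s, 7)))  # 1
```

The legitimacy predicate ("exactly one privileged process") is global, but each process's guard refers only to its left neighbor, which is the kind of locality the dissertation's local-reasoning conditions exploit.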
Abstract:
SWISSspine is a so-called pragmatic trial for the assessment of safety and efficacy of total disc arthroplasty (TDA). It follows the new health technology assessment (HTA) principle of "coverage with evidence development". It is the first mandatory HTA registry of its kind in the history of Swiss orthopaedic surgery. Its goal is the generation of evidence for a decision by the Swiss federal office of health about reimbursement of the concerned technologies and treatments by the basic health insurance of Switzerland. Between March 2005 and 2008, 427 interventions with implantation of 497 lumbar total disc arthroplasties were documented. Data were collected in a prospective, observational multicenter mode. The preliminary timeframe for the registry was 3 years and has already been extended. Data collection happens pre- and perioperatively, at the 3-month and 1-year follow-ups, and annually thereafter. Surgery, implant and follow-up case report forms are administered by spinal surgeons. Comorbidity questionnaires, NASS and EQ-5D forms are completed by the patients. Significant and clinically relevant reduction of low back pain (VAS 70.3 preop to 29.4 at 1 year postop, p < 0.0001) and leg pain (VAS 55.5 preop to 19.1 at 1 year postop, p < 0.001), improvement of quality of life (EQ-5D, 0.32 preop to 0.73 at 1 year postop, p < 0.001) and reduction of painkiller consumption were revealed at the 1-year follow-up. There were 14 (3.9%) complications and 7 (2.0%) revisions within the same hospitalization reported for monosegmental TDA; there were 6 (8.6%) complications and 8 (11.4%) revisions for bisegmental surgery. There were 35 patients (9.8%) with complications during follow-up in monosegmental and 9 (12.9%) in bisegmental surgery, and 11 (3.1%) revisions with 1 [corrected] new hospitalization in monosegmental and 1 (1.4%) in bisegmental surgery.
Regression analysis suggested a preoperative VAS "threshold value" of about 44 points for an increased likelihood of a minimum clinically relevant back pain improvement. In a short-term perspective, lumbar TDA appears to be a relatively safe and efficient procedure with respect to pain reduction and improvement of quality of life. Nevertheless, no prediction about the long-term goals of TDA can be made yet. The SWISSspine registry proves to be an excellent tool for the collection of observational data in a nationwide framework, whereby the advantages and deficits of its design must be considered. It can act as a model for similar projects in other health-care domains.
Abstract:
The first part of this paper provides a comprehensive and self-contained account of the interrelationships between algebraic properties of varieties and properties of their free algebras and equational consequence relations. In particular, proofs are given of known equivalences between the amalgamation property and the Robinson property, the congruence extension property and the extension property, and the flat amalgamation property and the deductive interpolation property, as well as various dependencies between these properties. These relationships are then exploited in the second part of the paper in order to provide new proofs of amalgamation and deductive interpolation for the varieties of lattice-ordered abelian groups and MV-algebras, and to determine important subvarieties of residuated lattices where these properties hold or fail. In particular, a full description is given of all subvarieties of commutative GMV-algebras possessing the amalgamation property.