5 results for black-box modelling

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

80.00%

Publisher:

Abstract:

Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies; the developer tries to accommodate user requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the Computer Science Department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving access through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are:
• to study the architecture of these tools, with the aim of understanding the source of their complexity;
• to exploit that knowledge to experiment with new solutions that, for backward-compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq.
Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different.
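Under the Curry-Howard isomorphism on which such CIC-based provers rest, a proposition is a type and a proof is a program inhabiting it. A minimal sketch, written here in Lean 4 notation rather than Matita's own syntax, of how the kernel certifies a proof by type-checking a term:

```lean
-- Illustrative example (Lean 4, not Matita syntax): modus ponens.
-- The proposition A → (A → B) → B is a type; the lambda term below
-- inhabits it, so type-checking the term certifies the proof.
theorem modus_ponens (A B : Prop) : A → (A → B) → B :=
  fun a f => f a
```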
The thesis is composed of two parts, in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences:
• our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the Four Colour Theorem;
• our collaboration on the D.A.M.A. Project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem.
The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the contents of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping, like CIC. In the second part of the manuscript many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities that assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually hidden from the user), one of which is specifically designed to be user driven.
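The coercive-subtyping idea mentioned above can be sketched as follows. The snippet uses Lean 4 notation as a stand-in for Matita's, and the structure names are illustrative assumptions, not the thesis's actual hierarchy:

```lean
-- A Group packs a Monoid; a registered coercion lets any Group be
-- used wherever a Monoid is expected, recovering inheritance in a
-- type theory (like CIC) that has no built-in subtyping.
structure Monoid where
  carrier : Type
  op      : carrier → carrier → carrier
  unit    : carrier

structure Group where
  toMonoid : Monoid
  inv      : toMonoid.carrier → toMonoid.carrier

instance : Coe Group Monoid := ⟨Group.toMonoid⟩
```

Multiple inheritance with shared substructures arises when two structures pack the same component (e.g. a Ring packing both an additive and a multiplicative Monoid over one carrier); the implementation described in the thesis handles such diamonds, which plain record nesting does not.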

Relevance:

80.00%

Publisher:

Abstract:

This paper studies relational goods as immaterial assets creating real effects in society. The work starts by answering this question: what kind of effects do relational goods produce? After a careful examination of the literature, we suppose that relational goods are social relations of second order. In this hypothesis they come from the emergence of two distinct social relations: interpersonal and reflexive relations. We describe empirical evidence of these emergent assets in social life and we test the effects they produce with a model. In the work we focus on four targets. First of all, we describe the emergence of relational goods through a mathematical model. Then we identify social settings where relational goods show evident effects and we outline our scientific hypothesis. The following step consists in the formulation of empirical tests. Finally, we present the results. Our aim is to set apart the constitutive structure of relational goods into a checkable model, coherently with the empirical evidence shown in the research. In the study we use multivariate analysis techniques to see relational goods in a new way, combining qualitative and quantitative strategies. Relational goods are analysed both as dependent and as independent variables in order to consider the causative factors acting in a black-box model. Moreover, we analyse the effects of relational goods inside social spheres, especially in the third sector and the capitalistic economy. Finally, we derive effective indexes of relational goods in order to compare them with some performance indexes.

Relevance:

80.00%

Publisher:

Abstract:

This thesis investigates Decomposition and Reformulation to solve Integer Linear Programming problems. This method is often a very successful approach computationally, producing high-quality solutions for well-structured combinatorial optimization problems like vehicle routing, cutting stock, p-median and generalized assignment. However, until now the method has always been tailored to the specific problem under investigation. The principal innovation of this thesis is to develop a new framework able to apply this concept to a generic MIP problem. The new approach is thus capable of auto-decomposition and auto-reformulation of the input problem, is applicable as a black-box solution algorithm, and works as a complement and alternative to the usual solution techniques. The idea of Decomposing and Reformulating (usually called Dantzig-Wolfe Decomposition, DWD, in the literature) is, given a MIP, to convexify one (or more) subset(s) of constraints (the slaves) and to work on the partially convexified polyhedron(s) obtained. For a given MIP several decompositions can be defined, depending on which sets of constraints we want to convexify. In this thesis we mainly reformulate MIPs using two sets of variables: the original variables and the extended variables (representing the exponentially many extreme points). The master constraints consist of the original constraints not included in any slave, plus the convexity constraint(s) and the linking constraints (ensuring that each original variable can be viewed as a linear combination of extreme points of the slaves). The solution procedure consists of iteratively solving the reformulated MIP (the master) and checking (pricing) whether a variable with negative reduced cost exists; if so, it is added to the master, which is solved again (column generation), otherwise the procedure stops.
The advantage of using DWD is that the reformulated relaxation gives bounds stronger than the original LP relaxation; in addition, it can be incorporated in a Branch-and-Bound scheme (Branch-and-Price) in order to solve the problem to optimality. If the computational time for the pricing problem is reasonable, this leads in practice to a substantial speed-up in solution time, especially when the convex hull of the slaves is easy to compute, usually because of its special structure.
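The pricing step can be illustrated on the classic cutting-stock slave, where pricing reduces to an unbounded knapsack over the dual values of the restricted master. This is a sketch of that one step, not the generic framework of the thesis; the instance data and the function name are assumptions:

```python
def price_new_pattern(widths, duals, roll_length):
    """Pricing for the cutting-stock slave: find a pattern a maximising
    sum(duals[i] * a[i]) subject to sum(widths[i] * a[i]) <= roll_length
    via unbounded-knapsack dynamic programming.  The pattern is worth
    adding to the master only when its reduced cost, 1 - value
    (each roll costs 1 in the master objective), is negative."""
    best = [0.0] * (roll_length + 1)   # best dual value per capacity
    take = [-1] * (roll_length + 1)    # item chosen at that capacity (-1: none)
    for c in range(1, roll_length + 1):
        best[c], take[c] = best[c - 1], -1
        for i, (w, y) in enumerate(zip(widths, duals)):
            if w <= c and best[c - w] + y > best[c]:
                best[c], take[c] = best[c - w] + y, i
    pattern, c = [0] * len(widths), roll_length   # reconstruct the pattern
    while c > 0:
        i = take[c]
        if i == -1:
            c -= 1
        else:
            pattern[i] += 1
            c -= widths[i]
    return pattern, 1.0 - best[roll_length]
```

When the returned reduced cost is non-negative, no improving column exists and column generation stops with the current restricted master optimal.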

Relevance:

80.00%

Publisher:

Abstract:

Within the framework of micro-CHP (Combined Heat and Power) energy systems and the Distributed Generation (DG) concept, an Integrated Energy System (IES) was conceived and built, able to meet the energy and thermal requirements of specific users by using different types of fuel to feed several micro-CHP energy sources, with the integration of electric generators based on renewable energy sources (RES), electrical and thermal storage systems, and the control system. A 5 kWel Polymer Electrolyte Membrane Fuel Cell (PEMFC) has been studied. Using experimental data obtained from various measurement campaigns, the electrical and CHP performance of the PEMFC system has been determined. The effect of the water management of the anodic exhaust at variable FC loads has been analysed, and the programming logic of the purge process was optimized, leading also to the determination of the optimal flooding times as a function of the AC power delivered by the cell. Furthermore, the degradation mechanisms of the PEMFC system, in particular those due to the flooding of the anodic side, have been assessed using an algorithm that treats the FC as a black box and is able to determine the amount of un-reacted H2 and, therefore, the causes producing it. Using experimental data covering a two-year time span, the ageing suffered by the FC system has been assessed and analyzed.
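The black-box H2 balance underlying such an algorithm can be illustrated with Faraday's law, which fixes the hydrogen consumed by the stack reaction; any supply beyond that figure leaves the anode un-reacted. A minimal sketch (the function and its inputs are hypothetical, not the thesis's actual algorithm):

```python
F = 96485.332  # Faraday constant, C/mol

def unreacted_h2(h2_supplied_mol_s, stack_current_a, n_cells):
    """Hypothetical black-box H2 balance: Faraday's law gives the H2
    consumed by the stack reaction, I * N / (2 F) mol/s (two electrons
    per H2 molecule); the difference from the measured supply is the
    un-reacted flow attributed to purges, crossover or flooding."""
    consumed = stack_current_a * n_cells / (2.0 * F)
    return h2_supplied_mol_s - consumed
```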

Relevance:

30.00%

Publisher:

Abstract:

This PhD Thesis is devoted to the accurate analysis of the physical properties of Active Galactic Nuclei (AGN) and of the AGN/host-galaxy interplay. Due to the broad-band AGN emission (from radio to hard X-rays), a multi-wavelength approach is mandatory. Our research is carried out over the COSMOS field, within the context of the XMM-Newton wide-field survey. To date, the COSMOS field is a unique area for comprehensive multi-wavelength studies, allowing us to define a large and homogeneous sample of QSOs with well-sampled spectral coverage and to keep selection effects under control. Moreover, the broad-band information contained in the COSMOS database is well-suited for a detailed analysis of AGN SEDs, bolometric luminosities and bolometric corrections. In order to investigate the nature of both obscured (Type-2) and unobscured (Type-1) AGN, the observational approach is complemented with a theoretical modelling of the AGN/galaxy co-evolution. The X-ray-to-optical properties of an X-ray selected Type-1 AGN sample are discussed in the first part. The relationship between the X-ray and optical/UV luminosities, parametrized by the spectral index αox, provides a first indication about the nature of the central engine powering the AGN. Since a Type-1 AGN outshines its surrounding environment, it is extremely difficult to constrain the properties of its host galaxy. Conversely, in Type-2 AGN the host-galaxy light is the dominant component of the optical/near-IR SEDs, severely affecting the recovery of the intrinsic AGN emission. Hence, a multi-component SED-fitting code is developed to disentangle the emission of the stellar population of the galaxy from that associated with mass accretion. Bolometric corrections, luminosities, stellar masses and star-formation rates, correlated with the morphology of Type-2 AGN hosts, are presented in the second part, while the final part concerns a physically motivated model for the evolution of spheroidal galaxies with a central SMBH.
The model is able to reproduce two important stages of galaxy evolution, namely the obscured cold-phase and the subsequent quiescent hot-phase.
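The spectral index αox mentioned above is conventionally defined as the slope of a power law connecting the monochromatic luminosities at 2 keV and 2500 Å. A small sketch of that standard formula (the function name is assumed; the thesis's exact conventions may differ in sign):

```python
import math

H_EV = 4.135667696e-15    # Planck constant, eV s
C_ANG = 2.99792458e18     # speed of light, Angstrom / s

def alpha_ox(l_nu_2kev, l_nu_2500):
    """X-ray-to-optical spectral index:
        alpha_ox = log10(L_2keV / L_2500A) / log10(nu_2keV / nu_2500A)
    with L_nu the monochromatic luminosities in matching units; the
    frequency ratio makes the prefactor ~0.384.  X-ray-weak sources
    give more negative values in this sign convention."""
    nu_2kev = 2000.0 / H_EV      # frequency of a 2 keV photon, Hz
    nu_2500 = C_ANG / 2500.0     # frequency at 2500 Angstrom, Hz
    return math.log10(l_nu_2kev / l_nu_2500) / math.log10(nu_2kev / nu_2500)
```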