844 results for Many-valued logic
Abstract:
Sustainable computer systems require some flexibility to adapt to unpredictable environmental changes. A solution lies in autonomous software agents, which can adapt autonomously to their environments. Though autonomy allows agents to decide which behavior to adopt, a disadvantage is a lack of control and, as a side effect, even untrustworthiness: we want to keep some control over such autonomous agents. How can autonomous agents be controlled while respecting their autonomy? A solution is to regulate agents' behavior by norms. The normative paradigm makes it possible to control autonomous agents while respecting their autonomy, limiting untrustworthiness and augmenting system compliance. It can also facilitate the design of the system, for example by regulating the coordination among agents. However, an autonomous agent will follow norms or violate them depending on the conditions. Under what conditions is a norm binding upon an agent? While autonomy is regarded as the driving force behind the normative paradigm, cognitive agents provide a basis for modeling the bindingness of norms. In order to cope with the complexity of modeling cognitive agents and normative bindingness, we adopt an intentional stance. Since agents are embedded in a dynamic environment, events may not occur at the same instant. Accordingly, our cognitive model is extended to account for some temporal aspects. Special attention is given to the temporal peculiarities of the legal domain such as, among others, the time in force and the time in efficacy of provisions. Some types of normative modifications are also discussed in the framework. It is noteworthy that our temporal account of legal reasoning is integrated with our commonsense temporal account of cognition. As our intention is to build sustainable reasoning systems running in unpredictable environments, we adopt a declarative representation of knowledge. A declarative representation of norms makes it easier to update their system representation, thus facilitating system maintenance, and improves system transparency, thus easing system governance. Since agents are bounded and embedded in unpredictable environments, and since conflicts may appear amongst mental states and norms, agent reasoning has to be defeasible, i.e. new pieces of information can invalidate formerly derivable conclusions. In this dissertation, our model is formalized in a non-monotonic logic, namely a temporal modal defeasible logic, in order to account for the interactions between normative systems and software cognitive agents.
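As a hedged illustration of the kind of rule such a temporal modal defeasible logic supports (the notation below is assumed for exposition only, not taken from the dissertation), a norm in force over an interval defeasibly yields an obligation, which a superior exception rule can withdraw:

```latex
r_1:\ \mathrm{InForce}(n,[t_1,t_2]) \wedge t \in [t_1,t_2] \Rightarrow \mathbf{O}A \,@\, t
\qquad
r_2:\ \mathrm{Exc}(e) \,@\, t \Rightarrow \neg\mathbf{O}A \,@\, t
\qquad r_2 \succ r_1
```

Deriving the obligation via r_1 is defeasible: once the exception e becomes provable at t, r_2 applies and, being superior, invalidates the formerly derivable conclusion.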
Abstract:
I. Max Bill is an intense giornata of a big fresco. An analysis of the main social, artistic and cultural events of the twentieth century is needed in order to trace his career through his masterpieces and architectures. Some of the faces of this hypothetical mural painting are, among others, Le Corbusier, Walter Gropius, Ernesto Nathan Rogers, Kandinskij, Klee, Mondrian, Vantongerloo and Ignazio Silone, while the backdrop is given by the artistic avant-gardes, the Bauhaus, the International Exhibitions, CIAM, war events, reconstruction, the Milan Triennali, the Venice Biennali, and the School of Ulm. An architect, even though better known as a painter, sculptor, designer and graphic artist, Max Bill attended the Bauhaus as a student in the years 1927-1929, and from this experience derived the main features of a rational, objective, constructive and non-figurative art. His research was devoted to giving his art a scientific methodology: each work proceeds from the analysis of a problem to the logical and always verifiable solution of that same problem. By means of composition elements (such as rhythm, seriality, theme and its variation, harmony and dissonance), he faced, with consistent results, themes apparently very distant from each other, such as the project for the H.f.G. or the design of a font. Mathematics is a constant reference frame, as a field of certainties, order and objectivity: 'for Bill mathematics is never confined to a simple function: it represents a climate of spiritual certainties, and also the theme of the unattempted in its purest state, objectivity of the sign and of the geometrical locus, and at the same time the restlessness of infinity: Limited and Unlimited'. In almost sixty years of activity, experiencing all artistic fields, Max Bill worked, projected, designed, and held conferences and exhibitions in Europe, Asia and the Americas, confronting himself with the most influential personalities of the twentieth century. In such a vast scenario, the need to limit the field of investigation was combined with the necessity to address and analyse the unpublished and original aspects of Bill's relations with Italy. The original contribution of the present research regards this particular 'geographic delimitation'; in particular, beyond the deep cultural exchanges between Bill and a series of Milanese architects, above all Rogers, two main projects have been addressed: the realtà nuova at the Milan Triennale in 1947, and the Contemporary Art Museum in Florence in 1980. It is important to note that these projects have not been previously investigated, and the former does not even appear in the sources. These works, together with the best-known ones, such as the projects for the VI and IX Triennale and the Swiss pavilion for the Biennale, add important details to the reference frame of the relations which took place between Zurich and Milan. Most of the occasions for exchange took place between the Thirties and the Fifties, years during which Bill underwent a significant period of artistic growth. He met the Swiss progressive architects and the Paris artists of the Abstraction-Création movement, entered the CIAM, collaborated with Le Corbusier on the third volume of his Complete Works, and in Milan worked on and confronted the events related to post-war reconstruction. In these years Bill defined his own working methodology, attaining artistic maturity in his work. The present research investigates the mentioned time period, with some necessary exceptions.
II. The official Max Bill bibliography is naturally wide, including popularizing works along with others devoted to analytical investigation, mainly written in German and often translated into French and English (Max Bill himself published his works in three languages). Few works have been published in Italian and, excluding the catalogue of the Parma exhibition of 1977, they cannot be considered comprehensive. Many publications are exhibition catalogues, some of which include essays written by Max Bill himself; some others carry Bill's comments in an educational-pedagogical vein, to accompany the observer towards a full understanding of the composition processes of his art works. Bill also left a great amount of theoretical speculation to encourage a critical reading of his works, in the form of books edited or written by him and essays published in 'Werk', the magazine of the Swiss Werkbund, and in other international reviews, among which Domus and Casabella. These three reviews have been important tools of analysis, since they include traces of some of Max Bill's architectural works. The architectural aspect is less investigated than the plastic and pictorial ones in all the main reference manuals on the subject: Benevolo, Tafuri and Dal Co, Frampton and Allenspach consider Max Bill as an artist proceeding in his work from the Bauhaus to the Ulm experience. A first cataloguing of his works was published in 2004 in the monographic issue of the Spanish magazine 2G, together with critical essays by Karin Gimmi, Stanislaus von Moos, Arthur Rüegg and Hans Frei, and in 'Konkrete Architektur?', again by Hans Frei. Other contributions are the monographic essay on the Atelier Haus building by Arthur Rüegg from 1997, and the DPA 17 issue of the Catalonia Polytechnic with contributions by Carlos Martì, Bruno Reichlin and Ton Salvadò, the latter publication concentrating on a few of Bill's themes and architectures. A renewed urge to study Max Bill's works in depth was marked in 2008 by the centenary of his birth and by a recent rediscovery of Bill as the initiator of the 'minimalist' tradition in Swiss architecture. Bill's heirs are both very active in promoting exhibitions, researching and publishing. Jakob Bill, Max Bill's son and a painter himself, recently published a work on Bill's experience at the Bauhaus, having earlier published an in-depth study of the 'Endless Ribbons' sculptures. Angela Thomas Schmid, Bill's wife and an art historian, published at the end of 2008 the first volume of a biography of Max Bill and, together with the film maker Eric Schmid, produced a documentary film which was also presented at the last Locarno Film Festival. Both the biography and the documentary concentrate on Max Bill's political involvement, from antifascism and the 1968 protest movements to Bill's experiences as a Zurich Municipality councilman and a member of the Swiss Confederation Parliament. In the present research, the bibliography also includes direct sources, such as interviews and original materials in the form of correspondence and graphic works, together with related essays, kept in the max+binia+jakob bill stiftung archive in Zurich. III. The results of the present research are organized into four main chapters, each of them subdivided into four parts. The first chapter concentrates on the research field and the reasons, tools and methodologies employed, whereas the second consists of a short biographical note organized by topics, introducing the subject of the research.
The third chapter, which includes unpublished events, traces the historical and cultural frame with particular reference to the relations between Max Bill and the Italian scene, especially Milan and the architects Rogers and Baldessari around the Fifties, searching for the themes and the keys for interpretation of Bill's architectures and investigating the critical debate in the reviews and the plastic survey through sculpture. The fourth and last chapter examines four main architectures chosen on a geographical basis, all devoted to exhibition spaces, investigating Max Bill's composition process in relation to the pictorial field. Paintings have surely been easier and faster to investigate and verify than buildings. A doctoral thesis discussed in Lausanne in 1977, investigating Max Bill's plastic and pictorial works, provided a series of devices which were corrected and adapted for the definition of the interpretation grid for the composition structures of Bill's main architectures. Four different tools are employed in the investigation of each work: a context analysis related to the results of chapter three; a specific theoretical essay by Max Bill briefly explaining his main theses, even though not directly linked to the very work of art considered; the interpretation grid for the composition themes derived from a related pictorial work; and the architectural drawing and digital three-dimensional model. The double analysis of the architectural and pictorial fields serves to underline the relation among the different elements of the composition process; the two fields, however, cannot be compared and remain, in Max Bill's works as in the present research, interdependent though self-sufficient. IV. An important aspect of Max Bill's production is self-referentiality: talking of Max Bill, also through Max Bill, as a need for coherence rather than a methodological limitation. Ernesto Nathan Rogers describes Bill as the last humanist, and his horizon is the known world; but, like the 'Concrete Art' of which he is one of the main representatives, his production justifies itself: Max Bill not only found a method, but autonomously re-wrote the 'rules of the game', derived timeless theoretical principles and verified them through a rich and interdisciplinary artistic production. The most recurrent words in the present research are synthesis, unity, space and logic. These terms are part of Max Bill's vocabulary and can be referred to his works. Similarly, the graphic settings and analytical schemes in this research text referring to or commenting on Bill's architectural projects were drawn up keeping in mind the concise precision of his architectural design. As has been written of Mies van der Rohe, Max Bill took art to 'degree zero', reaching in this way a high complexity. His works are a synthesis of art: they conceptually encompass all previous and, considering their developments, most contemporary works. Contents and message are generally explicitly declared in the title or in Bill's essays on his artistic works and architectural projects: the viewer is invited to go through and re-build the process of synthesis that generates the shape. In the course of an interview, the Milanese artist Getulio Alviani told how he would not write more than a page for an essay on Josef Albers: everything was already evident 'on the surface' and any additional sentence would be redundant.
Two years after that interview, these pages attempt to decompose and single out the elements and processes connected with some of Max Bill's works which, by their very origin, already contain all possible explanations and interpretations. Formal reduction in favour of content maximization is, perhaps, Max Bill's main lesson.
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite far from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition allows multiple channels of the PMT to be sampled simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature would probably be integrated in a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the FPGA's configuration memory implied the integration of a flash ISP (In System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA. PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from Roma University and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which were able to receive and execute commands issued by the PC console and to answer back with a reply. The remotely configurable logic behaved well too and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front-end while inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. The chip is a matrix of 4096 active pixel sensors with deep N-well implantations, meant for charge collection and for shielding the analog electronics from digital noise.
The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in STM 130 nm CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN in Geneva (CH). The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface towards and from the MAPS chip, in order to integrate it into the general DAQ system. Thereafter, I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinning were tested during the test beam. Those thinned to 100 and 300 um presented an overall efficiency of about 90% with a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual plot, taking into account the multiple scattering effect.
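For reference, the pitch/sqrt(12) figure quoted above is the standard binary-readout resolution, i.e. the standard deviation of a uniform distribution over one pixel pitch; a worked instance (the 50 um pitch is an illustrative assumption, not a value taken from the abstract):

```latex
\sigma = \frac{p}{\sqrt{12}}, \qquad p = 50\ \mu\mathrm{m} \;\Rightarrow\; \sigma \approx \frac{50}{3.464} \approx 14.4\ \mu\mathrm{m}
```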
Abstract:
Technology advances in recent years have dramatically changed the way users exploit contents and services available on the Internet, by enforcing pervasive and mobile computing scenarios and enabling access to networked resources almost from everywhere, at any time, and independently of the device in use. In addition, people increasingly want to customize their experience, by exploiting specific device capabilities and limitations, inherent features of the communication channel in use, and interaction paradigms that significantly differ from the traditional request/response one. The so-called Ubiquitous Internet scenario calls for solutions that address many different challenges, such as device mobility, session management, content adaptation, context-awareness, and the provisioning of multimodal interfaces. Moreover, new service opportunities demand simple and effective ways to integrate existing resources into new, value-added applications that can also undergo run-time modifications according to ever-changing execution conditions. Although service-oriented architectural models are gaining momentum to tame the increasing complexity of composing and orchestrating distributed and heterogeneous functionalities, existing solutions generally lack a unified approach and only provide support for specific Ubiquitous Internet aspects. Moreover, they usually target rather static scenarios and scarcely support the dynamic nature of pervasive access to Internet resources, which can soon make existing compositions obsolete or inadequate, hence in need of reconfiguration. This thesis proposes a novel middleware approach to deal comprehensively with the facets of the Ubiquitous Internet and assist in establishing innovative application scenarios. We claim that a truly viable ubiquity support infrastructure must neatly decouple the distributed resources to be integrated and push any kind of content-related logic outside its core layers, keeping only management and coordination responsibilities. Furthermore, we promote an innovative, open, and dynamic resource composition model that makes it possible to easily describe and enforce complex scenario requirements, and to react suitably to changes in the execution conditions.
Abstract:
Many research fields are pushing the engineering of large-scale, mobile, and open systems towards the adoption of techniques inspired by self-organisation: pervasive computing, but also distributed artificial intelligence, multi-agent systems, social networks, peer-to-peer and grid architectures exploit adaptive techniques to make global system properties emerge in spite of the unpredictability of interactions and behaviour. Such a trend is also visible in coordination models and languages, whenever a coordination infrastructure needs to cope with managing interactions in highly dynamic and unpredictable environments. As a consequence, self-organisation can be regarded as a feasible metaphor for defining a radically new conceptual coordination framework. The resulting framework defines a novel coordination paradigm, called self-organising coordination, based on the idea of spreading coordination media over the network and charging them with services that manage interactions based on local criteria, resulting in the emergence of desired and fruitful global coordination properties of the system. Features like topology, locality, time-reactiveness, and stochastic behaviour play a key role both in the definition of such a conceptual framework and in the consequent development of self-organising coordination services. According to this framework, the thesis presents several self-organising coordination techniques developed during the PhD course, mainly concerning data distribution in tuple-space-based coordination systems. Some of these techniques have also been implemented in ReSpecT, a coordination language for tuple spaces based on logic tuples and reactions to events occurring in a tuple space. In addition, the key role played by simulation and formal verification has been investigated, leading to an analysis of how automatic verification techniques like probabilistic model checking can be exploited to formally prove the emergence of desired behaviours in coordination approaches based on self-organisation. To this end, a concrete case study is presented and discussed.
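As a toy illustration of the local, stochastic flavour of such services (a minimal sketch, not ReSpecT and not taken from the thesis; the topology, rates and tuple names are invented), consider tuples that diffuse between neighbouring tuple spaces by purely local probabilistic moves, with an even spread emerging globally:

```python
import random

# Toy self-organising data distribution: each node hosts a local tuple space
# and, at each step, probabilistically hands a tuple to a random neighbour.
# The rules are purely local; the global effect is an even spread of tuples.

class Node:
    def __init__(self, name):
        self.name = name
        self.tuples = []
        self.neighbours = []

def step(nodes, p_move=0.3):
    for node in nodes:
        if node.tuples and node.neighbours and random.random() < p_move:
            t = node.tuples.pop(random.randrange(len(node.tuples)))
            random.choice(node.neighbours).tuples.append(t)

# Ring topology with all tuples initially concentrated on one node.
nodes = [Node(f"n{i}") for i in range(6)]
for i, n in enumerate(nodes):
    n.neighbours = [nodes[(i - 1) % 6], nodes[(i + 1) % 6]]
nodes[0].tuples = [("task", k) for k in range(60)]

for _ in range(500):
    step(nodes)
print([len(n.tuples) for n in nodes])  # roughly uniform: spreading has emerged
```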
Abstract:
Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a goal, and provide results such as 'the goal is derivable from the KB (of the theory)'. In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic, usually, we try to determine whether a given conclusion is logically implied by a set of assumptions T (theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather similar to human logic. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion depending on premises that are not definitely true, belonging to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed for efficient defeasible reasoning by Nute (see Chapter 2). Such applications are useful in the legal area, especially if they offer an implementation of an argumentation framework that provides a formal model of the game. Roughly speaking, if the theory is the set of laws, a key claim is the conclusion that one of the parties wants to prove (and the other wants to defeat); adding the dynamic assertion of rules, namely facts put forward by the parties, we can then play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the different strategies performed by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a meta-level containing different meta-evaluators. The first has been explained above, the second is needed to run the game model, and the last is used to change game execution and tree derivation strategies.
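To make the flavour of defeasible derivability concrete, here is a minimal sketch (a classic bird/penguin example with invented rule names, not the thesis implementation; antecedents are checked against facts only, with no rule chaining): a goal is defeasibly provable when some applicable rule supports it and every applicable attacking rule is beaten by the superiority relation.

```python
# Minimal, illustrative defeasible-derivability check.
# Rules: (name, antecedents, consequent); 'sup' records which rule beats which.
facts = {"bird", "penguin"}
rules = [
    ("r1", ["bird"], "flies"),
    ("r2", ["penguin"], "~flies"),
]
sup = {("r2", "r1")}  # r2 is more specific, so it defeats r1

def neg(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def applicable(rule):
    return all(a in facts for a in rule[1])

def defeasibly(goal):
    for r in rules:
        if r[2] == goal and applicable(r):
            attackers = [s for s in rules if s[2] == neg(goal) and applicable(s)]
            if all((r[0], s[0]) in sup for s in attackers):
                return True
    return False

print(defeasibly("flies"))   # False: r2 attacks r1 and is not beaten
print(defeasibly("~flies"))  # True: r1 attacks but is beaten by r2
```

Dropping "penguin" from the facts makes "flies" derivable again, which is exactly the non-monotonic behaviour the abstract describes: new information invalidates formerly derivable conclusions.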
Abstract:
Traditional logic gates are rapidly reaching the limits of miniaturization, and the overheating of these components is no longer negligible. A new physical approach to computation was proposed by Prof. C. S. Lent: 'molecular quantum cellular automata'. Indeed, the quantum-dot cellular automata (QCA) approach offers an attractive alternative to diode or transistor devices. The units encode binary information by two polarizations without current flow. The units of QCA theory are called QCA cells and can be realized in several ways. Molecules can act as QCA cells at room temperature. In collaboration with STMicroelectronics, the Electrochemistry group of Prof. Paolucci and the Nanotechnology laboratory in Lecce, we synthesized and studied with many techniques surface-active chiral bis-ferrocenes, conveniently designed to act as prototypical units for molecular computing devices. The chemistry of ferrocene was studied thoroughly, and we found the opportunity to promote substitution reactions of ferrocenyl alcohols with various nucleophiles without the aid of Lewis acids as catalysts. The only interaction between water and the two reagents is involved in the formation of a carbocation species, which is the true reactive species. We generalized this concept to other benzyl alcohols which generate stabilized carbocations. Carbocations described on Mayr's scale were fundamental for our research. Finally, we used these alcohols to alkylate aldehydes in an enantioselective way via organocatalysis.
Abstract:
The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergent peculiar structures of the individual phenotype. Being able to reproduce the system dynamics at the different levels of such a hierarchy can be very useful for studying such a complex phenomenon of self-organisation. The idea is to model the phenomenon in terms of a large and dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. According to these premises, the thesis proposes a review of the different approaches already developed for modelling developmental biology problems, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities in tackling multi-compartment / multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. This is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions addressing molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning in the model, the simulators are supplied with a module for parameter optimisation. The task is defined as an optimisation problem over the parameter space, in which the objective function to be minimised is the distance between the output of the simulator and a target one. The problem is tackled with a metaheuristic algorithm. As an example of application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. The model's goal is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene expression data with spatial and temporal resolution, acquired from free on-line sources.
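For orientation, the classic (non-optimised) form of Gillespie's direct method mentioned above can be sketched in a few lines; the reaction system and rate constants below are invented for illustration:

```python
import math, random

# Classic Gillespie direct method (illustrative sketch; the thesis engine uses
# a many-species/many-channels optimised variant).
# Example system: A + B -> C (rate k1), C -> A + B (rate k2).

def propensities(state, k1, k2):
    a, b, c = state
    return [k1 * a * b, k2 * c]

def gillespie(state, k1, k2, t_end):
    t = 0.0
    while t < t_end:
        props = propensities(state, k1, k2)
        a0 = sum(props)
        if a0 == 0:
            break
        t += -math.log(1.0 - random.random()) / a0  # exponential waiting time
        r = random.random() * a0                    # pick a reaction channel
        if r < props[0]:
            state = (state[0] - 1, state[1] - 1, state[2] + 1)  # A + B -> C
        else:
            state = (state[0] + 1, state[1] + 1, state[2] - 1)  # C -> A + B
    return state

print(gillespie((100, 100, 0), k1=0.01, k2=0.1, t_end=10.0))
```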
Abstract:
Like many other languages of East and Southeast Asia, Thai is a number-neutral language in which a noun merely names a concept and gives no indication of the number of objects. To count nouns in Thai, a classifier (CLF) is needed, which picks out and individualizes the objects on the basis of their semantic key property. Besides classification, individualization is the main function of the CLF. Further core functions of the CLF outside the counting context are the marking of definiteness, number and contrast. The most important new results of this work, which integrates the levels of grammar and semantics as well as those of logic and pragmatics, are the following. In Thai the CLF can operate both at the element level and at the set level. In combination with a demonstrative, the CLF can also give rise to a plural interpretation, when it refers to a totality presupposed as plural or individualizes the totality within a part-whole relation. In an expression that already contains an explicit numeral, the CLF-demonstrative construction produces a contrast between sets with the same properties. Like the concept of an individual, the CLF possesses intension and extension. The intension and extension of Thai CLFs are inversely proportional: the more specific the content of a CLF, the smaller its range. The CLF signals the key feature which, together with the intension of the noun, serves to identify the object. The CLF individualizes the noun by quantifying subsets; it can refer to one object, to a specific number of objects, or to all objects. In formal logic these functions can be represented by means of the existential and the universal quantifier. The zero slot (NST) can also be represented in formal logic. Reduced to their respective information content, CLF and NST yield different information values depending on their position: the opposition of CLF and NST produces, in the questionnaires, exclusively scalar Q-implicatures, which can be represented by the information formulas in the form of a Horn scale. In a developing context, both CLF and NST convey known information in mid-context, triggering implicatures of the M- or I-principle. By linking the information values with the implicatures of the Q-, M- and I-principles, one can read off directly from the position when the CLF fulfils the function of number, definiteness or contrast marking.
Abstract:
Part 1: Known constructions. The present work first gives a detailed overview of the developments to date in the classical field of hypersurfaces with many singularities. The maximal number mu^n(d) of singularities on a hypersurface of degree d in P^n(C) is known only in very few cases; in P^3(C), for example, only for d<=6. Apart from such exceptions, only upper and lower bounds exist. Part 2: New constructions. For small degrees d it is often possible to obtain better results than those given by general bounds. In this work we describe several algorithmic approaches for this, one of which uses computer algebra in characteristic 0. Our other algorithmic methods are based on a search over finite fields. Lifting the hypersurfaces found experimentally in this way, by exploiting their geometry or arithmetic, yields for example a surface of degree 7 with 99 real ordinary double points and a surface of degree 9 with 226 ordinary double points. These constructions give the first lower bounds for mu^3(d) for odd degree d>5 that exceed the general bound. Our algorithm moreover has the potential to be applied to many further problems in algebraic geometry. Besides these algorithmic methods, we describe a construction of hypersurfaces of degree d in P^n with many A_j singularities, j>=2. These examples, whose existence we prove with the help of the theory of dessins d'enfants, exceed the known lower bounds in most cases and in particular yield new asymptotic lower bounds for j>=2, n>=3. Part 3: Visualization. We conclude our work with an application of our new visualization software surfex, which bundles the strengths of several existing programs, to the construction of affine equations of all 45 topological types of real cubic surfaces.
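For context, the few known values of mu^3(d) alluded to above are the classical ones (stated here from standard references, not from the abstract itself):

```latex
\mu^3(3) = 4 \ (\text{Cayley}), \quad
\mu^3(4) = 16 \ (\text{Kummer}), \quad
\mu^3(5) = 31 \ (\text{Togliatti}), \quad
\mu^3(6) = 65 \ (\text{Barth})
```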
Abstract:
Synthetic biology has recently undergone great development: many papers have been published and many applications have been presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, however, most applications are quite simple and do not fully exploit the potential of this discipline. This limitation in complexity has many causes, like the incomplete characterization of some components or the intrinsic variability of biological systems, but one of the most important reasons is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to try to solve this problem through the engineering of a multicellular behaviour in prokaryotic cells. This system will introduce a cooperative behaviour that will make it possible to implement complex functionalities that cannot be obtained with a single cell. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to identify a single process as organizer and coordinator of a series of tasks assigned to the whole population. The election of the leader greatly simplifies the computation by providing centralized control. Furthermore, this system may even be useful for evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and the experimental characterization of a component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after receiving the signal that a leader is present in the colony. The most important element, in this case, is the hybrid promoter: it has been realized in different versions, applying the heuristic rules stated in [22], and their activity has been experimentally tested. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction, in the cellular environment, of particular molecules, inducers, that can be considered inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is one only in the presence of both inducers. The robustness and the stability of this behaviour were tested by changing the concentration of the input signals and building dose-response curves. From these data it is possible to conclude that the analysed constructs have an AND-like behaviour over a wide range of inducer concentrations, even if it is possible to identify many differences in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so their binary representation is not able to capture the complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system that is necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission. In particular, the interaction between this element and the one responsible for emitting the chemical signal was tested.
The desired behaviour is still similar to a logic AND since, even in this case, the output signal is determined by the hybrid promoter activity. The experimental results demonstrated that the systems behave correctly, even if there is still substantial variability between them. The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter the DNA sequences of the hybrid promoters are analysed, trying to identify the regulatory elements that are most important for the determination of gene expression. Given the available data, it was not possible to draw definitive conclusions. In the end, a few considerations on promoter engineering and the realization of complex circuits are presented. This section aims to briefly recall some of the problems outlined in the introduction and to provide a few possible solutions.
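A minimal quantitative sketch of the AND-like dose-response behaviour described above (a toy two-input Hill-function model with invented parameters K1, K2, n and basal level, not the thesis's characterization data):

```python
import numpy as np

# Toy AND-like hybrid promoter: output is high only when both inducers are
# present; the continuous dose-response surface shows why a binary (0/1)
# reading under-describes the behaviour, as noted in the abstract.

def hill(x, K, n):
    return x**n / (K**n + x**n)

def and_gate_output(inducer1, inducer2, K1=10.0, K2=5.0, n=2.0, basal=0.02):
    return basal + (1 - basal) * hill(inducer1, K1, n) * hill(inducer2, K2, n)

# Dose-response surface over a log-spaced grid of inducer concentrations.
c1 = np.logspace(-1, 3, 5)
c2 = np.logspace(-1, 3, 5)
for a in c1:
    row = [f"{and_gate_output(a, b):.2f}" for b in c2]
    print(f"I1={a:8.1f}: {row}")
# Output approaches 1 only in the corner where both inducers are high.
```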
Abstract:
The fundamental idea from which this doctoral thesis starts is that it is possible to speak of a turn in the way social policies are conceptualized and implemented, a turn whose focus increasingly becomes the construction of partnership networks between public and private actors, in which a series of plural social subjects (stakeholders) activate relational reflexivity among themselves. The general hypothesis of the research is that, after policies shaped by statist and market models, or a mix of the two, Italian social policy shows the need for a reflexive and relational turn towards a societal, subsidiary and plural model, and that in fact, especially at the local level, many initiatives in this direction are emerging. One of the most promising ideas seems to be the creation of social districts to make public, private and third-sector actors collaborate in order to create innovative forms of services for the family and the person. This thesis focuses on the attempt of the Province of Trento to organize family policies into districts. Through the analysis of the project 'Trentino – Territorio Amico della Famiglia' and of one of its verticalizations, the Distretto Famiglia (Family District), the contribution of public-private partnerships to the formation of innovative governance tools that can produce a morphogenetic turn in the elaboration of family policies has been studied. The conclusions of the work, through a comparison of territorial experiences, present the differentiation of social partnerships according to several variables (plurality of actors, plurality of resources, shared project, social capital, decision making, mutual action, logics of relational work, subsidiarity). The different ways of managing partnerships (capacitating, professional and generative) synthesize the cultural, structural and personal elements involved in the individual constructions. Only the partnerships that interpret their regulative and promotional potential according to relational reflexivity tend to generate common goods in the social context.
Abstract:
Analysts, politicians and international players from all over the world look at China as one of the most powerful countries on the international scene, and as a country whose economic development can significantly impact the economies of the rest of the world. However, many aspects of this country still have to be investigated. First among them is the still fundamental role played by Chinese rural areas in the general development of the country from a political, economic and social point of view. In particular, the way in which the rural areas have influenced the social stability of the whole country has been widely discussed, due to their close relationship with the urban areas to which most people from the countryside emigrate in search of a job and a better life. In recent years many studies have mostly focused on the urbanization phenomenon, with little interest in the living conditions in rural areas and in the deep changes which have occurred in some mainly agricultural provinces. The analysis of the level of infrastructure is one of the main aspects highlighting the principal differences in living conditions between rural and urban areas. In this thesis, I first carried out the analysis through a multivariate statistics approach (Principal Component Analysis and Cluster Analysis) in order to define a new map of rural areas based on the analysis of living conditions. In the second part I elaborated an index (the Living Conditions Index) through a Fuzzy Expert/Inference System. Finally, I compared this index (LCI) with the results obtained from the cluster analysis, drawing geographic maps. The data source is the second national agricultural census of China, carried out in 2006. In particular, I analysed data referring to villages but aggregated at the province level.
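As a hedged sketch of how such a fuzzy inference index can work (the indicators, membership functions and rules below are invented for illustration; the thesis's actual system is built from the census variables):

```python
# Toy Mamdani-style fuzzy index: two hypothetical infrastructure indicators,
# water access and road access (0-100), are fuzzified, combined by rules,
# and defuzzified into a 0-1 living-conditions score.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def low(x):  return tri(x, -1, 0, 50)
def high(x): return tri(x, 50, 100, 101)

def living_conditions_index(water, road):
    # Rule 1: water high AND road high -> good conditions (score 1.0)
    # Rule 2: water low  OR  road low  -> poor conditions (score 0.0)
    good = min(high(water), high(road))
    poor = max(low(water), low(road))
    if good + poor == 0:
        return 0.5  # no rule fires: neutral score
    return (good * 1.0 + poor * 0.0) / (good + poor)  # weighted-average defuzzification

print(living_conditions_index(water=80, road=70))  # 1.0: both indicators high
print(living_conditions_index(water=20, road=60))  # 0.0: dominated by poor water access
```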
Abstract:
Tectonic and aesthetic implications of integrated monocoque logics and stress-line analysis in architecture.
Abstract:
Over the last 60 years, computers and software have favoured incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to show some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to performing the conformance check, identifying any deviation from the desired behaviour as soon as possible and possibly applying corrections. The declarative framework that implements our approach, entirely developed on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology makes it possible to reconcile any deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology brings some advancements to the problem of conformance checking, helping to fill the gap between humans and increasingly complex technology.
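To give a feel for the Event Calculus machinery at the core of the monitoring module, here is a minimal sketch in Python (illustrative only; the thesis module is implemented on Drools, and the events and fluent names below are invented): events initiate and terminate fluents, and a query checks whether a fluent holds at a given time over a narrative of timestamped events.

```python
# Minimal Event Calculus sketch: holds_at(f, t) is true if some earlier event
# initiated fluent f and no later event before t terminated it.

narrative = [(1, "login"), (5, "logout")]          # hypothetical event narrative
initiates = {"login": ["session_open"]}            # event -> fluents it initiates
terminates = {"logout": ["session_open"]}          # event -> fluents it terminates

def holds_at(fluent, t):
    holds = False
    for time, event in sorted(narrative):
        if time >= t:
            break
        if fluent in initiates.get(event, []):
            holds = True
        if fluent in terminates.get(event, []):
            holds = False
    return holds

print(holds_at("session_open", 3))  # True: initiated at 1, not yet terminated
print(holds_at("session_open", 7))  # False: terminated at 5
```

In the JIT setting described above, such queries would run as events stream in, so a violated expectation is flagged as soon as the terminating (or missing) event is observed rather than after execution ends.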