10 results for Fine-grained
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Matita (Italian for "pencil") is a new interactive theorem prover under development at the University of Bologna. When compared with state-of-the-art proof assistants, Matita presents both traditional and innovative aspects. The underlying calculus of the system, namely the Calculus of (Co)Inductive Constructions (CIC for short), is well known and is used as the basis of another mainstream proof assistant—Coq—with which Matita is to some extent compatible. In the same spirit as several other systems, proof authoring is conducted by the user as a goal-directed proof search, using a script to store textual commands for the system. In the tradition of LCF, the proof language of Matita is procedural and relies on tactics and tacticals to proceed toward proof completion. The interaction paradigm offered to the user is based on the script management technique that underlies the popularity of the Proof General generic interface for interactive theorem provers: while editing a script, the user can move the execution point forward to deliver commands to the system, or backward to retract (or "undo") past commands. Matita has been developed from scratch over the past 8 years by several members of the Helm research group, of which this thesis author is one. Matita is now a full-fledged proof assistant with a library of about 1,000 concepts. Several innovative solutions spun off from this development effort. This thesis is about the design and implementation of some of those solutions, in particular those relevant to the topic of user interaction with theorem provers and to which this thesis author was a major contributor. Joint work with other members of the research group is pointed out where needed. The main topics discussed in this thesis are briefly summarized below. Disambiguation. Most activities connected with interactive proving require the user to input mathematical formulae.
Since mathematical notation is ambiguous, parsing formulae typeset as mathematicians like to write them on paper is a challenging task; a challenge neglected by several theorem provers, which usually prefer to fix an unambiguous input syntax. Exploiting features of the underlying calculus, Matita offers an efficient disambiguation engine that permits typing formulae in the familiar mathematical notation. Step-by-step tacticals. Tacticals are higher-order constructs used in proof scripts to combine tactics together. With tacticals, scripts can be made shorter, more readable, and more resilient to changes. Unfortunately, they are de facto incompatible with state-of-the-art user interfaces based on script management. Such interfaces do not allow positioning the execution point inside complex tacticals, thus introducing a trade-off between the usefulness of structured scripts and a tedious big-step execution behavior during script replaying. In Matita we break this trade-off with tinycals: an alternative to a subset of LCF tacticals that can be evaluated in a more fine-grained manner. Extensible yet meaningful notation. Proof assistant users often need to create new mathematical notation in order to ease the use of new concepts. The framework used in Matita for dealing with extensible notation both accounts for high-quality two-dimensional rendering of formulae (with the expressivity of MathML Presentation) and provides meaningful notation, where presentational fragments are kept synchronized with the semantic representation of terms. With our approach, interoperability with other systems can be achieved at the content level, and direct manipulation of formulae through their rendered forms is also possible. Publish/subscribe hints. Automation plays an important role in interactive proving, as users like to delegate tedious proving sub-tasks to decision procedures or external reasoners.
Exploiting the Web-friendliness of Matita, we experimented with a broker and a network of web services (called tutors) that independently try to complete open sub-goals of a proof currently being authored in Matita. The user receives hints from the tutors on how to complete sub-goals and can apply them to the current proof, either interactively or automatically. Another innovative aspect of Matita, only marginally touched by this thesis, is the embedded content-based search engine Whelp, which is exploited to various ends, from automatic theorem proving to avoiding duplicate work for the user. We also discuss the (potential) reusability in other systems of the widgets presented in this thesis, and how we envisage the evolution of user interfaces for interactive theorem provers in the Web 2.0 era.
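To give readers unfamiliar with LCF-style proof languages a flavour of what tacticals are, the sketch below models tactics as functions from a goal to a list of remaining sub-goals, and tacticals as combinators over them. This is an illustrative toy in Python (Matita itself is written in OCaml), and every name in it is invented, not part of Matita's actual API.

```python
# Illustrative LCF-style tacticals: a tactic maps a goal to the list of
# sub-goals it leaves open; tacticals combine tactics into larger ones.
# All names are hypothetical -- this is not Matita's implementation.

def then_(t1, t2):
    """THEN tactical: apply t1, then apply t2 to every resulting sub-goal."""
    def combined(goal):
        return [g2 for g in t1(goal) for g2 in t2(g)]
    return combined

def orelse(t1, t2):
    """ORELSE tactical: try t1; if it fails, fall back to t2."""
    def combined(goal):
        try:
            return t1(goal)
        except ValueError:
            return t2(goal)
    return combined

# Toy goals: conjunctions of the trivially provable atom "T".
def split(goal):
    """Split a conjunction 'A & B' into the two sub-goals A and B."""
    if "&" not in goal:
        raise ValueError("not a conjunction")
    left, right = goal.split("&", 1)
    return [left.strip(), right.strip()]

def trivial(goal):
    """Close the trivial goal 'T'."""
    if goal != "T":
        raise ValueError("not trivial")
    return []  # no sub-goals left: the goal is proved

proof = then_(split, trivial)
print(proof("T & T"))  # -> [] : both sub-goals closed
```

A script-management UI that executes `then_(split, trivial)` atomically must replay it as one big step; the point of tinycals is to let the execution point stop between `split` and `trivial`.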
Abstract:
A multidisciplinary study was carried out on the Late Quaternary-Holocene subsurface deposits of two Mediterranean coastal areas: the Arno coastal plain (Northern Tyrrhenian Sea) and the Modern Po Delta (Northern Adriatic Sea). Detailed facies analyses, including sedimentological and micropalaeontological (benthic foraminifers and ostracods) investigations, were performed on nine continuously cored boreholes of variable depth (ca. 30 to 100 meters). Six cores were located in the Arno coastal plain and three in the Modern Po Delta. To provide an accurate chronological framework, twenty-four organic-rich samples were collected along the fossil successions for radiocarbon dating (AMS 14C). In order to reconstruct the depositional and palaeoenvironmental evolution of the study areas, core data were combined with selected well logs, provided by local companies, along several stratigraphic sections. These sections revealed the presence of a transgressive-regressive (T-R) sequence, composed of continental, coastal and shallow-marine deposits dated to the Late Pleistocene-Holocene period, beneath the Arno coastal plain and the Modern Po Delta. Above the alluvial deposits attributed to the last glacial period, the post-glacial transgressive succession (TST) consists of back-barrier, transgressive-barrier and inner-shelf deposits. The peak of transgression (MFS) took place around the Late-Middle Holocene transition and was identified by subtle micropalaeontological indicators within undifferentiated fine-grained deposits. Upward, a thick prograding succession (HST) records the turnaround to regressive conditions that led to rapid delta progradation in both study areas. In particular, the outbuilding of the modern-age Po Delta coincides with mud-belt formation during the late HST (ca. 600 cal yr BP), as evidenced by a fossil microfauna similar to the foraminiferal assemblage observed in the present Northern Adriatic mud-belt.
A complex interaction between allocyclic and autocyclic factors controlled facies evolution during the highstand period. The presence of local parameters and the absence of a predominant factor prevent discerning or quantifying the consequences of the complex relationships between climate and deltaic evolution. In contrast, transgressive sedimentation appears to have been controlled mainly by two allocyclic key factors, sea-level rise and climate variability, which minimized the effects of local parameters on coastal palaeoenvironments. The TST depositional architecture recorded in both study areas reflects a well-known millennial-scale variability of the sea-level rise trend and climate during the Late glacial-Holocene period. Repeated phases of backswamp development and infilling by crevasse processes (parasequences) were recorded in the subsurface of the Modern Po Delta during the early stages of transgression (ca. 11,000-9,500 cal yr BP). In the Arno coastal plain, the presence of a deeply incised valley system, probably formed at the OIS 3/2 transition, led to the development of a thick (ca. 35-40 m) transgressive succession composed of coastal-plain, bay-head delta and estuarine deposits dated to the Last glacial-Early Holocene period. Within the transgressive valley-fill sequence, high-resolution facies analyses allowed the identification and lateral tracing of three parasequences of millennial duration. The parasequences, ca. 8-12 meters thick, are bounded by flooding surfaces and show a typical internal shallowing-upward trend evidenced by subtle micropalaeontological investigations. The vertical stacking pattern of parasequences shows a close affinity with the step-like sea-level rise that occurred between 14,000 and 8,000 cal yr BP.
Episodes of rapid sea-level rise and subsequent stillstand phases were paralleled by changes in climatic conditions, as suggested by pollen analyses performed on a core drilled in the proximal section of the Arno palaeovalley (pollen analyses performed by Dr. Marianna Ricci Lucchi). Rapid shifts to warmer climate conditions accompanied episodes of rapid sea-level rise, whereas stillstand phases occurred during temporarily colder climate conditions. For the first time, the palaeoclimatic signature of high-frequency depositional cycles is clearly documented. Moreover, two of the three "regressive" pulsations, recorded at the top of parasequences by episodes of partial estuary infilling in the proximal and central portions of the Arno palaeovalley, may be correlated with the most important cold events of the post-glacial period: the Younger Dryas and the 8,200 cal yr BP event. The stratigraphic and palaeoclimatic data of the Arno coastal plain and Po Delta were compared with those reported for the most important deltaic and coastal systems in the worldwide literature. The depositional architecture of transgressive successions reflects the strong influence of millennial-scale eustatic and climatic variability on worldwide coastal sedimentation during the Late glacial-Holocene period (ca. 14,000-7,000 cal yr BP). The most complete and accurate records of high-frequency eustatic and climatic events are usually found within the transgressive successions of very high accommodation settings, such as incised-valley systems, where exceptionally thick packages of Late glacial-Early Holocene deposits are preserved.
Abstract:
The aim of this thesis was to investigate the respective contribution of prior information and sensorimotor constraints to action understanding, and to estimate their consequences on the evolution of human social learning. Even though a huge amount of literature is dedicated to the study of action understanding and its role in social learning, these issues are still largely debated. Here, I critically describe two main perspectives. The first perspective interprets faithful social learning as an outcome of a fine-grained representation of others’ actions and intentions that requires sophisticated socio-cognitive skills. In contrast, the second perspective highlights the role of simpler decision heuristics, the recruitment of which is determined by individual and ecological constraints. The present thesis aims to show, through four experimental works, that these two contributions are not mutually exclusive. A first study investigates the role of the inferior frontal cortex (IFC), the anterior intraparietal area (AIP) and the primary somatosensory cortex (S1) in the recognition of other people’s actions, using a transcranial magnetic stimulation adaptation paradigm (TMSA). The second work studies whether, and how, higher-order and lower-order prior information (acquired from the probabilistic sampling of past events vs. derived from an estimation of biomechanical constraints of observed actions) interacts during the prediction of other people’s intentions. Using a single-pulse TMS procedure, the third study investigates whether the interaction between these two classes of priors modulates the motor system activity. The fourth study tests the extent to which behavioral and ecological constraints influence the emergence of faithful social learning strategies at a population level. 
The collected data help to elucidate how higher-order and lower-order prior expectations interact during action prediction, and clarify the neural mechanisms underlying such interaction. Finally, these works open promising perspectives for a better understanding of social learning, with possible extensions to animal models.
Abstract:
In distributed systems like clouds or service oriented frameworks, applications are typically assembled by deploying and connecting a large number of heterogeneous software components, spanning from fine-grained packages to coarse-grained complex services. The complexity of such systems requires a rich set of techniques and tools to support the automation of their deployment process. By relying on a formal model of components, a technique is devised for computing the sequence of actions allowing the deployment of a desired configuration. An efficient algorithm, working in polynomial time, is described and proven to be sound and complete. Finally, a prototype tool implementing the proposed algorithm has been developed. Experimental results support the adoption of this novel approach in real life scenarios.
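The core intuition behind computing "the sequence of actions allowing the deployment of a desired configuration" can be conveyed with a small sketch: if each component declares the components it requires, a valid deployment plan is an order in which every component is deployed after its dependencies (a topological sort). This Python toy is only an illustration under that simplifying assumption; the thesis's formal component model and polynomial algorithm are considerably richer (component states, bindings, conflicts) and are not reproduced here.

```python
# Minimal sketch: compute a deployment order respecting dependencies
# (Kahn-style topological sort). Component names are invented examples.
from collections import deque

def deployment_plan(requires):
    """requires: dict mapping component -> set of required components."""
    pending = {c: set(deps) for c, deps in requires.items()}
    ready = deque(c for c, deps in pending.items() if not deps)
    plan = []
    while ready:
        c = ready.popleft()
        plan.append(c)
        for other, deps in pending.items():
            if c in deps:
                deps.remove(c)
                if not deps:  # all dependencies of `other` now deployed
                    ready.append(other)
    if len(plan) != len(pending):
        raise ValueError("circular dependency: no deployment order exists")
    return plan

plan = deployment_plan({
    "webapp": {"database", "cache"},
    "database": set(),
    "cache": set(),
})
print(plan)  # e.g. ['database', 'cache', 'webapp']
```

The soundness/completeness claims in the abstract concern the thesis's full algorithm, not this simplified ordering step.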
Abstract:
People tend to automatically mimic the facial expressions of others. While clear evidence exists on the effect of non-verbal behavior (emotion faces) on automatic facial mimicry, little is known about the role of verbal behavior (emotion language) in triggering such effects. Whereas it is well established that political affiliation modulates facial mimicry, no evidence exists on whether this modulation also passes through verbal means. This research addressed the role of verbal behavior in triggering automatic facial effects depending on whether verbal stimuli are attributed to leaders of different political parties. Study 1 investigated the role of interpersonal verbs, referring to positive and negative emotion expressions and encoding them at different levels of abstraction, in triggering corresponding facial muscle activation in a reader. Study 2 examined the role of verbs expressing positive and negative emotional behaviors of political leaders in modulating automatic facial effects depending on the matched or mismatched political affiliation of participants and left- and right-wing politicians. Study 3 examined whether verbs expressing happiness displays of ingroup politicians induce a more sincere (Duchenne) smile pattern among readers of the same political affiliation relative to happiness expressions of outgroup politicians. Results showed that verbs encoding facial actions at different levels of abstraction elicited differential facial muscle activity (Study 1). Furthermore, political affiliation significantly modulated the facial activation triggered by emotion verbs, as participants showed more congruent and enhanced facial activity towards ingroup politicians' smiles and frowns compared to those of outgroup politicians (Study 2). Participants responded with a more sincere smile pattern towards verbs expressing smiles of ingroup compared to outgroup politicians (Study 3).
Altogether, the results showed that the role of political affiliation in modulating automatic facial effects also passes through verbal channels and is revealed at a fine-grained level, inducing quantitative and qualitative differences in readers' automatic facial reactions.
Abstract:
Network monitoring is of paramount importance for effective network management: it makes it possible to constantly observe the network's behavior, ensure it is working as intended, and trigger both automated and manual remediation procedures in case of failures and anomalies. The concept of SDN decouples the control logic from the legacy network infrastructure to perform centralized control over multiple switches in the network; in this context, the responsibility of switches is only to forward packets according to the flow-control instructions provided by the controller. However, since current SDN switches only expose simple per-port and per-flow counters, the controller has to do almost all the processing to determine the network state, which causes significant communication overhead and excessive latency for monitoring purposes. The absence of programmability in the SDN data plane prompted the advent of programmable switches, which allow developers to customize the data-plane pipeline and implement novel programs operating directly in the switches. This means that certain monitoring tasks can be offloaded to programmable data planes, performing fine-grained monitoring even at very high packet processing speeds. Given the central importance of network monitoring exploiting programmable data planes, the goal of this thesis is to enable a wide range of monitoring tasks in programmable switches, with a specific focus on those equipped with programmable ASICs. Indeed, most network monitoring solutions available in the literature do not take the computational and memory constraints of programmable switches into due account, preventing, de facto, their successful implementation in commodity switches. This thesis shows that such network monitoring tasks can nonetheless be executed in programmable switches.
Our evaluations show that the contributions of this thesis could be used by network administrators as well as network security engineers to better understand the network status according to different monitoring metrics, and thus prevent network infrastructure and service outages.
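To illustrate the kind of memory-constrained data structure this line of work relies on, the sketch below implements a count-min sketch, a standard building block in data-plane monitoring for estimating per-flow packet counts in fixed memory with bounded overestimation. It is written in Python for readability and is purely illustrative; it is not code from the thesis, which targets programmable switch ASICs.

```python
# Count-min sketch: estimate per-flow counts in O(rows * cols) memory.
# Estimates never undercount; they overcount only on hash collisions.
import hashlib

class CountMinSketch:
    def __init__(self, rows=4, cols=512):
        self.rows, self.cols = rows, cols
        self.table = [[0] * cols for _ in range(rows)]

    def _index(self, row, key):
        # One independent hash function per row, derived from SHA-256.
        digest = hashlib.sha256(f"{row}:{key}".encode()).digest()
        return int.from_bytes(digest[:8], "big") % self.cols

    def add(self, key, count=1):
        for r in range(self.rows):
            self.table[r][self._index(r, key)] += count

    def estimate(self, key):
        # The minimum across rows is the least-collided counter.
        return min(self.table[r][self._index(r, key)]
                   for r in range(self.rows))

cms = CountMinSketch()
for _ in range(1000):
    cms.add("10.0.0.1->10.0.0.2")  # hypothetical flow identifier
print(cms.estimate("10.0.0.1->10.0.0.2"))  # >= 1000
```

In an actual switch the counter arrays live in SRAM register arrays and the hashes are computed by the match-action pipeline; the Python version only conveys the space/accuracy trade-off.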
Abstract:
This thesis is composed of a collection of works written in the period 2019-2022, whose aim is to find methodologies of Artificial Intelligence (AI) and Machine Learning to detect and classify patterns and rules in argumentative and legal texts. We call our approach "hybrid", since we aimed at designing hybrid combinations of symbolic and sub-symbolic AI, involving both "top-down" structured knowledge and "bottom-up" data-driven knowledge. A first group of works is dedicated to the classification of argumentative patterns. Following the Waltonian model of argument and the related theory of Argumentation Schemes, these works focused on the detection of argumentative support and opposition, showing that argumentative evidence can be classified at fine-grained levels without resorting to highly engineered features. To show this, our methods involved not only traditional approaches such as TF-IDF, but also some novel methods based on Tree Kernel algorithms. After the encouraging results of this first phase, we explored the use of some emerging methodologies promoted by actors like Google, which have deeply changed NLP since 2018-19, i.e., Transfer Learning and language models. These new methodologies markedly improved our previous results, providing us with best-performing NLP tools. Using Transfer Learning, we also performed a Sequence Labelling task to recognize the exact span of argumentative components (i.e., claims and premises), thus connecting portions of natural language to portions of arguments (i.e., to the logical-inferential dimension). The last part of our work was finally dedicated to the employment of Transfer Learning methods for the detection of rules and deontic modalities. In this case, we explored a hybrid approach which combines structured knowledge coming from two LegalXML formats (i.e., Akoma Ntoso and LegalRuleML) with sub-symbolic knowledge coming from pre-trained (and then fine-tuned) neural architectures.
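As a flavour of the TF-IDF baseline mentioned above, the pure-Python toy below represents sentences as TF-IDF weight vectors and labels a new argumentative sentence by cosine similarity to "support" and "oppose" exemplars. All sentences and labels are invented for illustration; the thesis works on real annotated corpora and also employs Tree Kernels and Transformer models, none of which are shown here.

```python
# Toy TF-IDF + cosine-similarity labelling of argumentative stance.
import math
from collections import Counter

def tfidf(docs):
    """One TF-IDF weight dict per tokenized document."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    return [{t: (c / len(doc)) * math.log(n / df[t])
             for t, c in Counter(doc).items()} for doc in docs]

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm = lambda x: math.sqrt(sum(w * w for w in x.values()))
    return dot / (norm(u) * norm(v)) if norm(u) and norm(v) else 0.0

support = "the evidence supports the claim".split()
oppose = "the findings contradict the claim".split()
query = "new data supports this claim".split()
vec_s, vec_o, vec_q = tfidf([support, oppose, query])
label = "support" if cosine(vec_q, vec_s) > cosine(vec_q, vec_o) else "oppose"
print(label)  # -> support
```

The point of the abstract is that even such lexical features, and a fortiori kernel- and Transformer-based representations, can separate fine-grained argumentative classes without hand-engineered features.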
Abstract:
In recent years, there has been exponential growth in the use of virtual spaces, including dialogue systems, that handle personal information. The concept of personal privacy in the literature is debated and controversial, whereas in the technological field it directly influences the degree of reliability perceived in an information system (privacy 'as trust'). This work aims to protect the right to privacy over personal data (GDPR, 2018) and avoid the loss of sensitive content by exploring the sensitive information detection (SID) task. It is grounded on the following research questions: (RQ1) What does sensitive data mean? How to define a personal sensitive information domain? (RQ2) How to create a state-of-the-art model for SID? (RQ3) How to evaluate the model? RQ1 theoretically investigates the concept of privacy and the ontological state-of-the-art representation of personal information. The Data Privacy Vocabulary (DPV) is the taxonomic resource taken as an authoritative reference for the definition of the knowledge domain. Concerning RQ2, we investigate two approaches to classifying sensitive data: the first, bottom-up, explores automatic learning methods based on transformer networks; the second, top-down, proposes logical-symbolic methods with the construction of privaframe, a knowledge graph of compositional frames representing personal data categories. Both approaches are tested. For the evaluation (RQ3), we create SPeDaC, a sentence-level labeled resource. This can be used as a benchmark or for training in the SID task, filling the gap of a shared resource in this field. While the approach based on artificial neural networks confirms the validity of the direction adopted in the most recent studies on SID, the logical-symbolic approach emerges as the preferred way to classify fine-grained personal data categories, thanks to the semantically grounded, tailored modeling it allows.
At the same time, the results highlight the strong potential of hybrid architectures in solving automatic tasks.
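To make the top-down, symbolic route concrete, the toy below maps lexical trigger patterns to personal-data categories (category names loosely inspired by the DPV taxonomy, not taken from it verbatim). The thesis's privaframe resource uses compositional semantic frames, which are far richer than these illustrative regular expressions; everything here is an invented sketch.

```python
# Toy rule-based sensitive information detection: lexical triggers mapped
# to hypothetical personal-data categories.
import re

RULES = {
    "HealthRecord": re.compile(r"\b(diagnos\w+|prescri\w+|therapy)\b", re.I),
    "FinancialAccount": re.compile(r"\b(iban|credit card|account number)\b", re.I),
    "Contact": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def detect_sensitive(sentence):
    """Return the set of categories whose trigger pattern fires."""
    return {cat for cat, pattern in RULES.items() if pattern.search(sentence)}

hits = detect_sensitive(
    "My doctor changed my prescription; write me at jo@example.org")
print(sorted(hits))  # -> ['Contact', 'HealthRecord']
```

A frame-based system replaces the regexes with structured frames (roles, fillers, lexical units), which is what makes the fine-grained, semantically grounded classification the abstract describes possible.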
Abstract:
The application of modern ICT technologies is radically changing many fields, pushing toward more open and dynamic value chains and fostering the cooperation and integration of many connected partners, sensors, and devices. As a valuable example, the emerging Smart Tourism field derives from the application of ICT to Tourism so as to create richer and more integrated experiences, making them more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of those sources exacerbate the complexity of developing integration solutions, with consequent high effort and costs for partners seeking them. Taking a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims at facilitating the blending of data and services. At its core, APERTO relies on APERTO FaaS, a Serverless platform allowing fast prototyping of the business logic, lowering the barrier to entry and development costs for newcomers, fine-grained (down-to-zero) scaling of resources servicing end-users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communications between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments.
In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions, enabling smarter management of complex tasks and improved multiplexing capabilities; ii) the creation of end-to-end differentiated QoS slices minimizing interference among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; and iv) a decentralized approach for the verification of access rights to resources.
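The function-composition idea in point i) can be sketched in a few lines: a workload is built by chaining simple, ready-to-use functions into a single pipeline. The Python below is illustrative only; the `compose` operator and the stage functions are invented for this sketch and are not APERTO's actual API.

```python
# Minimal sketch of FaaS-style function composition: chain simple
# functions left to right into one workload. All names are hypothetical.
from functools import reduce

def compose(*stages):
    """Chain functions so each stage receives the previous stage's output."""
    return lambda payload: reduce(lambda acc, f: f(acc), stages, payload)

# Hypothetical ready-to-use functions for a tourism data workload.
def fetch(event):
    return {"raw": event["source"].upper()}

def enrich(data):
    return {**data, "enriched": True}

def respond(data):
    return f"{data['raw']} (enriched={data['enriched']})"

workflow = compose(fetch, enrich, respond)
print(workflow({"source": "museum-feed"}))  # -> MUSEUM-FEED (enriched=True)
```

In a real FaaS platform each stage would be an independently deployed function and the composition would be expressed in a workflow definition rather than in-process calls; the sketch only conveys the programming model.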