897 results for Abstraction.
Abstract:
This work is a reflection on the developments of the notion of definition in the recent debate on analyticity. The revival of this discussion, after Quine's criticisms and a consequent initial abandonment of the Carnapian conventionalist conception, has resulted in a new epistemic conception of analyticity. In most cases the new epistemic theories, among them those of Bob Hale and Crispin Wright (Implicit Definition and the A Priori, 2001) and Paul Boghossian (Analyticity, 1997; Epistemic Analyticity, a Defence, 2002; Blind Reasoning, 2003; Is Meaning Normative?, 2005), share the common feature of understanding a priori knowledge in the form of an implicit definition (Paul Horwich, Stipulation, Meaning, and Apriority, 2001). But a second line of objections, raised first by Horwich and later by Hale and Wright themselves, highlights two difficulties for definition, corresponding respectively to the issues of epistemic arrogance and of the acceptance (or stipulation) of an implicit definition. From this starting point several attempted responses arise. On the one hand, there is a conception of definition, in Hale and Wright's theory, according to which it appears as an abstraction principle; on the other, a notion of definition as implicit definition, which draws on Boghossian's conception. In the latter, the implicit definition is given in the form of a linguistic conditional (EA, 2002; BR, 2003), obtained by factorizing the theory on the Carnapian model for the theoretical terms of empirical theories. A careful analysis of Rudolf Carnap's work (Philosophical Foundations of Physics, 1966) shows that the factorization strategy represents a viable route to a notion of analyticity adequate for theoretical terms. The Carnapian strategy is in fact situated within an attempt to elaborate a notion of analyticity that takes into account the inductive aspects of empirical theories.
Abstract:
Complex network analysis has turned out to be a very promising field of research, as testified by many research projects and works spanning different fields. Such analyses have usually focused on characterizing a single aspect of the system, and a study that considers the many informative axes along which a network evolves is lacking. We propose a new multidimensional analysis that is able to inspect networks along the two most important dimensions, space and time. To achieve this goal, we studied each dimension separately and investigated how the variation of the constituting parameters drives changes in the network as a whole. Focusing on the space dimension, we characterized spatial alteration in terms of abstraction levels. We propose a novel algorithm that, by applying a fuzziness function, can reconstruct networks at different levels of detail. We verified that statistical indicators depend strongly on the granularity with which a system is described and on the class of networks. We then kept the space axis fixed and isolated the dynamics behind the network evolution process. We detected new instincts that trigger social network utilization and spread the adoption of novel communities. We formalized this enhanced social network evolution by introducing special nodes (called sirens) that, thanks to their ability to attract new links, construct efficient connection patterns. We simulated the dynamics of the system by considering three well-known growth models. Applying this framework to real and synthetic networks, we showed that sirens, even when used for a limited time span, effectively shrink the time needed to bring a network to a mature state. In order to provide a concrete context for our findings, we formalized the cost of setting up such an enhancement and provided the best combinations of the system's parameters, such as the number of sirens, their time span of utilization and their attractiveness.
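The thesis's actual growth models and siren formalization are not given here; the following is only a minimal Python sketch, under the assumption that a siren can be modeled as a node whose attachment weight is temporarily boosted in a degree-proportional (preferential-attachment style) growth process. The names grow_with_sirens, siren_boost and siren_steps are hypothetical.

import random

def grow_with_sirens(n_nodes, n_sirens=2, siren_boost=10.0, siren_steps=50, seed=0):
    """Grow a network by degree-proportional attachment; for the first
    `siren_steps` additions, siren nodes have their weight multiplied by
    `siren_boost`, mimicking their enhanced ability to attract new links."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n_sirens + 2)}    # small fully connected seed
    for i in adj:
        for j in adj:
            if i < j:
                adj[i].add(j)
                adj[j].add(i)
    sirens = set(range(n_sirens))
    for t, new in enumerate(range(len(adj), n_nodes)):
        weights = []
        for v in adj:
            w = len(adj[v]) + 1.0                    # degree-proportional weight
            if v in sirens and t < siren_steps:      # temporary siren boost
                w *= siren_boost
            weights.append((v, w))
        r = rng.uniform(0, sum(w for _, w in weights))
        target = weights[-1][0]
        for v, w in weights:                         # roulette-wheel selection
            r -= w
            if r <= 0:
                target = v
                break
        adj[new] = {target}
        adj[target].add(new)
    return adj

network = grow_with_sirens(200)
print(max(len(neighbors) for neighbors in network.values()))

Even this toy version reproduces the qualitative ingredient described above: early links accumulate around the boosted nodes, which then serve as hubs for the rest of the growth process.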
Abstract:
Mainstream hardware is becoming parallel, heterogeneous, and distributed on every desk, in every home and in every pocket. As a consequence, in recent years software has been taking an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and the growing availability of networks. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers have to face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. We then shift the perspective from the development of intelligent software systems toward general-purpose software development. Using the expertise gained during the background-construction phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and at the same time provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
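As a point of reference for the actor paradigm the thesis starts from (not simpAL itself, whose syntax is not reproduced here), a minimal actor can be sketched in Python: private state, a mailbox, and a single thread that processes one message at a time, so no external code ever touches the state directly.

import threading, queue

class Actor:
    """Minimal actor: private state, a mailbox, one thread draining it."""
    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, msg):
        self._mailbox.put(msg)

    def stop(self):
        self._mailbox.put(None)          # poison pill ends the message loop
        self._thread.join()

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:
                break
            self.receive(msg)

    def receive(self, msg):              # behaviour supplied by subclasses
        raise NotImplementedError

class Counter(Actor):
    def __init__(self):
        self.count = 0                   # touched only by the actor's own thread
        super().__init__()

    def receive(self, msg):
        if msg == "inc":
            self.count += 1

c = Counter()
for _ in range(10):
    c.send("inc")
c.stop()
print(c.count)                           # 10

Agent-oriented programming, as argued in the abstract, layers further human-inspired concepts on top of this message-driven encapsulation.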
Abstract:
Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmers' responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages have been proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm to modern many-core systems, focusing on ease of programming. It concentrates on OpenMP, the de facto standard for shared-memory programming. In the first part, the costs of algorithms for synchronization and data partitioning are analyzed, and the algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. In the second part of the thesis, the focus is on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders-of-magnitude improvements in speed and energy efficiency compared to the "pure software" version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, and a template for a scalable architecture is shown, which integrates the HWPUs through the shared-memory system. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform. The OpenMP frontend is extended to interact with it.
Abstract:
People tend to automatically mimic facial expressions of others. While clear evidence exists on the effect of non-verbal behavior (emotion faces) on automatic facial mimicry, little is known about the role of verbal behavior (emotion language) in triggering such effects. Whereas it is well established that political affiliation modulates facial mimicry, no evidence exists on whether this modulation also passes through verbal means. This research addressed the role of verbal behavior in triggering automatic facial effects depending on whether verbal stimuli are attributed to leaders of different political parties. Study 1 investigated the role of interpersonal verbs, referring to positive and negative emotion expressions and encoding them at different levels of abstraction, in triggering corresponding facial muscle activation in a reader. Study 2 examined the role of verbs expressing positive and negative emotional behaviors of political leaders in modulating automatic facial effects depending on the matched or mismatched political affiliation of participants and politicians of the left and right wing. Study 3 examined whether verbs expressing happiness displays of ingroup politicians induce a more sincere smile (Duchenne) pattern among readers of the same political affiliation relative to happiness expressions of outgroup politicians. Results showed that verbs encoding facial actions at different levels of abstraction elicited differential facial muscle activity (Study 1). Furthermore, political affiliation significantly modulated facial activation triggered by emotion verbs, as participants showed more congruent and enhanced facial activity towards ingroup politicians' smiles and frowns compared to those of outgroup politicians (Study 2). Participants facially responded with a more sincere smile pattern towards verbs expressing smiles of ingroup compared to outgroup politicians (Study 3). Altogether, the results showed that the role of political affiliation in modulating automatic facial effects also passes through verbal channels and is revealed at a fine-grained level by inducing quantitative and qualitative differences in the automatic facial reactions of readers.
Abstract:
Aggregate programming is a paradigm that supports the programming of systems of devices, adaptive and possibly large-scale, as a whole -- as aggregates. The prevailing approach in this context is based on the field calculus, a formal calculus that allows aggregate programs to be defined through the functional composition of computational fields, laying the groundwork for the specification of robust self-organization patterns. Aggregate programming is currently supported, more or less partially and mainly for simulation purposes, by dedicated DSLs (cf. Protelis), but there are no frameworks for mainstream languages aimed at application development. Yet such support would be desirable to reduce adoption time and effort, to simplify access to the paradigm when building real systems, and to foster research in the field itself. The present work consists in the development, starting from a prototype of the operational semantics of the field calculus, of a framework for aggregate programming in Scala. The choice of Scala as the host language stems from technical and practical reasons. Scala is a modern language, interoperable with Java, that integrates the object-oriented and functional paradigms well, has an expressive type system, and provides advanced features for the development of libraries and DSLs. Moreover, the possibility of relying, on top of Scala, on a solid actor framework such as Akka is another driving factor, given the need to bridge the abstraction gap inherent in the development of a distributed middleware. The thesis presents a framework that achieves a threefold objective: the construction of a Scala library that implements the semantics of the field calculus correctly and completely, the realization of an Akka-based distributed platform on which to develop applications, and the exposure of a general and flexible API able to support different scenarios.
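The Scala framework itself is not shown here. As a language-neutral illustration of the core idea only, a computational field can be viewed as a map from devices to local values, with aggregate programs built by composing such fields functionally; the sketch below (in Python, with a hypothetical helper named lift) renders just this conceptual reading, not the field calculus semantics implemented by the library.

# A computational field, viewed abstractly, maps each device to a local value.
# Aggregate programs compose fields; here composition is simply pointwise.

Field = dict  # device id -> value (illustrative alias)

def lift(op, *fields):
    """Apply op pointwise across fields defined on the same set of devices."""
    devices = set(fields[0])
    return {d: op(*(f[d] for f in fields)) for d in devices}

temperature = {1: 19.5, 2: 22.0, 3: 25.3}        # a sensor field
threshold = {d: 21.0 for d in temperature}       # a constant field
too_warm = lift(lambda t, k: t > k, temperature, threshold)
print(too_warm)   # {1: False, 2: True, 3: True}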
Abstract:
The Bergman cyclization of large polycyclic enediyne systems that mimic the cores of the enediyne anticancer antibiotics was studied using the ONIOM hybrid method. Tests on small enediynes show that ONIOM can accurately match experimental data. The effect of the triggering reaction in the natural products is investigated, and we support the argument that it is strain effects that lower the cyclization barrier. The barrier for the triggered molecule is very low, leading to a reasonable half-life at biological temperatures. No evidence is found that would suggest a concerted cyclization/H-atom abstraction mechanism is necessary for DNA cleavage.
Abstract:
The AM1 and PM3 molecular orbital methods have been utilized to investigate the reactions of CH2OH with NO and NO2. PM3 and AM1 calculated heats of formation differ from experimental values by 8.6 and 18.8 kcal mol-1, respectively. The dominant reaction of CH2OH with NO is predicted to produce the adduct HOCH2NO, supporting the hypothesis of Pagsberg, Munk, Anastasi, and Simpson. Calculated activation energies for the NO2 system predict the formation of the adducts HOCH2NO2 and HOCH2ONO. In addition, the PM3 calculations predict that the abstraction reaction producing CH2O and HNO2 is more likely than one producing CH2O and HONO from reactions of CH2OH with NO2.
Abstract:
Over the past 7 years, the enediyne anticancer antibiotics have been widely studied due to their DNA cleaving ability. The focus of these antibiotics, represented by kedarcidin chromophore, neocarzinostatin chromophore, calicheamicin, esperamicin A, and dynemicin A, is on the enediyne moiety contained within each of these antibiotics. In its inactive form, the moiety is benign to its environment. Upon suitable activation, the system undergoes a Bergman cycloaromatization proceeding through a 1,4-dehydrobenzene diradical intermediate. It is this diradical intermediate that is thought to cleave double-stranded DNA through hydrogen atom abstraction. Semiempirical, semiempirical CI, Hartree–Fock ab initio, and MP2 electron correlation methods have been used to investigate the inactive hex-3-ene-1,5-diyne reactant, the 1,4-dehydrobenzene diradical, and a transition state structure of the Bergman reaction. Geometries calculated with different basis sets and by semiempirical methods have been used for single-point calculations using electron correlation methods. These results are compared with the best experimental and theoretical results reported in the literature. Implications of these results for computational studies of the enediyne anticancer antibiotics are discussed.
Abstract:
The Simulation Automation Framework for Experiments (SAFE) is a project created to raise the level of abstraction in network simulation tools and thereby address issues that undermine credibility. SAFE incorporates best practices in network simulation to automate the experimental process and to guide users in the development of sound scientific studies using the popular ns-3 network simulator. My contributions to the SAFE project are the design of two XML-based languages called NEDL (ns-3 Experiment Description Language) and NSTL (ns-3 Script Templating Language), which facilitate the description of experiments and network simulation models, respectively. The languages provide a foundation for the construction of better interfaces between the user and the ns-3 simulator. They also provide input to a mechanism which automates the execution of network simulation experiments. Additionally, this thesis demonstrates that one can develop tools to generate ns-3 scripts in Python or C++ automatically from NSTL model descriptions.
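NSTL itself is XML-based and its syntax is not reproduced here; the sketch below only illustrates the underlying idea of script templating, using Python's standard string.Template to instantiate a simplified, illustrative ns-3 C++ script from experiment parameters.

from string import Template

# A toy "script template" in the spirit of NSTL: placeholders stand in for the
# experiment parameters; the real NSTL is richer than this.
NS3_TEMPLATE = Template("""\
// generated ns-3 (C++) script -- illustrative only
#include "ns3/core-module.h"
#include "ns3/network-module.h"
using namespace ns3;

int main() {
  NodeContainer nodes;
  nodes.Create($num_nodes);
  Simulator::Stop(Seconds($duration));
  Simulator::Run();
  Simulator::Destroy();
  return 0;
}
""")

def generate_script(num_nodes, duration):
    """Fill the template with one experiment's parameters."""
    return NS3_TEMPLATE.substitute(num_nodes=num_nodes, duration=duration)

print(generate_script(num_nodes=10, duration=30.0))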
Abstract:
When reengineering legacy systems, it is crucial to assess whether the legacy behavior has been preserved or how it changed due to the reengineering effort. Ideally, if a legacy system is covered by tests, running the tests on the new version can identify potential differences or discrepancies. However, writing tests for an unknown and large system is difficult due to the lack of internal knowledge. It is especially difficult to bring the system to an appropriate state. Our solution is based on the acknowledgment that one of the few trustworthy pieces of information available when approaching a legacy system is the running system itself. Our approach reifies the execution traces and uses logic programming to express tests on them. It thereby eliminates the need to programmatically bring the system into a particular state, and hands the test writer a high-level abstraction mechanism to query the trace. The resulting system, called TESTLOG, was used on several real-world case studies to validate our claims.
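TESTLOG expresses such queries in logic programming; the following is only a small Python sketch of the same idea, assuming a hypothetical TraceEvent record and a query helper, to show how a test can be phrased over a reified trace instead of by driving the system into a particular state.

from dataclasses import dataclass

@dataclass
class TraceEvent:
    """One reified method invocation recorded while the legacy system runs."""
    receiver: str
    method: str
    args: tuple
    result: object

# a tiny recorded trace (hypothetical events, for illustration)
trace = [
    TraceEvent("Cart(1)", "add", ("book", 2), None),
    TraceEvent("Cart(1)", "total", (), 40),
    TraceEvent("Cart(2)", "total", (), 0),
]

def query(trace, **conditions):
    """Declarative-style query: keep events whose fields match all conditions."""
    return [e for e in trace
            if all(getattr(e, k) == v for k, v in conditions.items())]

# a "test" expressed as a query on the trace
assert query(trace, receiver="Cart(1)", method="total")[0].result == 40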
Abstract:
Most of today's dynamic analysis approaches are based on method traces. However, in the case of object orientation, understanding program execution by analyzing method traces is complicated because the behavior of a program depends on the sharing and the transfer of object references (aliasing). We argue that trace-based dynamic analysis operates at too low a level of abstraction for object-oriented systems. We propose a new approach that captures the life cycle of objects by explicitly taking into account object aliasing and how aliases propagate during the execution of the program. In this paper, we present our new meta-model in detail and discuss future tracks opened by it.
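The paper's meta-model is not reproduced here; the sketch below is a hypothetical, simplified rendering of the idea in Python: each object's life cycle is reified together with the aliases through which references to it propagate, so questions about sharing and transfer can be asked directly rather than inferred from raw method traces.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Alias:
    """One reference to the object and how it was obtained."""
    holder: str        # e.g. a local variable, argument, or field
    transfer: str      # "creation", "argument", "return", "field write"
    at_method: str

@dataclass
class ObjectLifeCycle:
    """Meta-model entity: one runtime object plus the history of its aliases."""
    object_id: int
    created_in: str
    aliases: List[Alias] = field(default_factory=list)

# hypothetical record of how object #7 flows through an execution
order = ObjectLifeCycle(7, created_in="Shop.checkout")
order.aliases += [
    Alias("local 'o'", "creation", "Shop.checkout"),
    Alias("arg 'order'", "argument", "Billing.invoice"),
    Alias("field 'last'", "field write", "Billing.invoice"),
]

# a question raw method traces cannot answer directly: who still holds the object?
holders = [a.holder for a in order.aliases if a.transfer == "field write"]
print(holders)   # ["field 'last'"]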
Abstract:
Virtualization has become a common abstraction layer in modern data centers. By multiplexing hardware resources into multiple virtual machines (VMs) and thus enabling several operating systems to run on the same physical platform simultaneously, it can effectively reduce power consumption and building size or improve security by isolating VMs. In a virtualized system, memory resource management plays a critical role in achieving high resource utilization and performance. Insufficient memory allocation to a VM will degrade its performance dramatically; conversely, over-allocation wastes memory resources. Meanwhile, a VM's memory demand may vary significantly. As a result, effective memory resource management calls for a dynamic memory balancer, which, ideally, can adjust memory allocation in a timely manner for each VM based on its current memory demand and thus achieve the best memory utilization and the optimal overall performance. In order to estimate the memory demand of each VM and to arbitrate possible memory resource contention, a widely proposed approach is to construct an LRU-based miss ratio curve (MRC), which provides not only the current working set size (WSS) but also the correlation between performance and the target memory allocation size. Unfortunately, the cost of constructing an MRC is nontrivial. In this dissertation, we first present a low-overhead LRU-based memory demand tracking scheme, which includes three orthogonal optimizations: AVL-based LRU organization, dynamic hot set sizing and intermittent memory tracking. Our evaluation results show that, for the whole SPEC CPU 2006 benchmark suite, after applying the three optimizing techniques, the mean overhead of MRC construction is lowered from 173% to only 2%. Based on the current WSS, we then predict its trend in the near future and adopt different strategies for different prediction results. When there is a sufficient amount of physical memory on the host, the host locally balances its memory resources among the VMs. Once the local memory resource is insufficient and the memory pressure is predicted to persist for a sufficiently long time, a relatively expensive solution, VM live migration, is used to move one or more VMs from the hot host to other host(s). Finally, for transient memory pressure, a remote cache is used to alleviate the temporary performance penalty. Our experimental results show that this design achieves a 49% center-wide speedup.
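For concreteness, the classic single-pass LRU stack (Mattson-style) construction of a miss ratio curve can be sketched as below; this is the naive baseline whose cost the dissertation's optimizations (AVL-based LRU organization, dynamic hot set sizing, intermittent tracking) are designed to reduce, not the dissertation's own implementation.

from collections import Counter

def miss_ratio_curve(accesses, max_size):
    """One pass over the access trace yields the LRU miss ratio for every
    cache size up to max_size, via reuse (stack) distances."""
    stack = []                      # most-recently-used page at the front
    distances = Counter()           # stack-distance histogram
    cold_misses = 0
    for page in accesses:
        if page in stack:
            depth = stack.index(page) + 1   # 1-based stack distance
            distances[depth] += 1
            stack.remove(page)
        else:
            cold_misses += 1
        stack.insert(0, page)

    total = len(accesses)
    mrc = {}
    hits = 0
    for size in range(1, max_size + 1):
        hits += distances[size]             # distance <= size would be a hit
        mrc[size] = (total - hits) / total  # everything else misses
    return mrc

trace = [1, 2, 3, 1, 2, 4, 1, 2, 3, 4] * 20
print(miss_ratio_curve(trace, max_size=4))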
Abstract:
AIM: The purpose of this study was to systematically review the literature on the survival rates of palatal implants, Onplants®, miniplates and miniscrews. MATERIAL AND METHODS: An electronic MEDLINE search supplemented by manual searching was conducted to identify randomized clinical trials and prospective and retrospective cohort studies on palatal implants, Onplants®, miniplates and miniscrews with a mean follow-up time of at least 12 weeks and at least 10 units per modality examined clinically at a follow-up visit. Assessment of studies and data abstraction was performed independently by two reviewers. Reported failures of the devices used were analyzed using random-effects Poisson regression models to obtain summary estimates and 95% confidence intervals (CI) of failure and survival proportions. RESULTS: The search up to January 2009 provided 390 titles and 71 abstracts, with full-text analysis of 34 articles yielding 27 studies that met the inclusion criteria. In the meta-analysis, the failure rate for Onplants® was 17.2% (95% CI: 5.9-35.8%), 10.5% for palatal implants (95% CI: 6.1-18.1%), 16.4% for miniscrews (95% CI: 13.4-20.1%) and 7.3% for miniplates (95% CI: 5.4-9.9%). Miniplates and palatal implants, representing torque-resisting temporary anchorage devices (TADs), when grouped together, showed a 1.92-fold (95% CI: 1.06-2.78) lower clinical failure rate than miniscrews. CONCLUSION: Based on the available evidence in the literature, palatal implants and miniplates showed comparable survival rates of ≥90% over a period of at least 12 weeks, and yielded superior survival compared with miniscrews. Palatal implants and miniplates for temporary anchorage provide reliable absolute orthodontic anchorage. If the intended orthodontic treatment would require multiple miniscrew placements to provide adequate anchorage, the reliability of such systems is questionable. For patients who are undergoing extensive orthodontic treatment, force vectors may need to be varied or the roots of the teeth to be moved may need to slide past the anchors. In this context, palatal implants or miniplates should be the TADs of choice.
Abstract:
OBJECTIVES: The objective of this systematic review was to assess the 5-year survival rates and incidences of complications associated with ceramic abutments and to compare them with those of metal abutments. METHODS: An electronic Medline search complemented by manual searching was conducted to identify randomized controlled clinical trials and prospective and retrospective studies providing information on ceramic and metal abutments with a mean follow-up time of at least 3 years. Patients had to have been examined clinically at the follow-up visit. Assessment of the identified studies and data abstraction was performed independently by three reviewers. Failure rates were analyzed using standard and random-effects Poisson regression models to obtain summary estimates of 5-year survival proportions. RESULTS: Twenty-nine clinical and 22 laboratory studies were selected from an initial yield of 7136 titles and data were extracted. The estimated 5-year survival rate was 99.1% [95% confidence interval (CI): 93.8-99.9%] for ceramic abutments and 97.4% (95% CI: 96-98.3%) for metal abutments. The estimated cumulative incidence of technical complications after 5 years was 6.9% (95% CI: 3.5-13.4%) for ceramic abutments and 15.9% (95% CI: 11.6-21.5%) for metal abutments. Abutment screw loosening was the most frequent technical problem, occurring with an estimated cumulative incidence after 5 years of 5.1% (95% CI: 3.3-7.7%). All-ceramic crowns supported by ceramic abutments exhibited annual fracture rates similar to those of metal-ceramic crowns supported by metal abutments. The cumulative incidence of biological complications after 5 years was estimated at 5.2% (95% CI: 0.4-52%) for ceramic and 7.7% (95% CI: 4.7-12.5%) for metal abutments. Esthetic complications tended to be more frequent with metal abutments. A meta-analysis of the laboratory data was impossible due to the non-standardized test methods of the studies included. CONCLUSION: The 5-year survival rates estimated from annual failure rates appeared to be similar for ceramic and metal abutments. The information included in this review did not provide evidence for differences in the technical and biological outcomes of ceramic and metal abutments. However, the information for ceramic abutments was limited in the number of studies and abutments analyzed as well as the accrued follow-up time. Standardized methods for the analysis of abutment strength are needed.