10 results for Integrity constraints

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

20.00%

Publisher:

Abstract:

Singularities of robot manipulators have been intensely studied in recent decades by researchers from many fields. Serial singularities produce a local loss of dexterity of the manipulator, so it may be desirable to search for singularity-free trajectories in the joint space. Parallel singularities, on the other hand, are very dangerous for parallel manipulators, for they may provoke a local loss of platform control and jeopardize the structural integrity of links or actuators. It is therefore of the utmost importance to avoid parallel singularities while operating a parallel machine. Furthermore, there may be configurations of a parallel manipulator that are allowed by the constraints but are nevertheless unreachable by any feasible path. The present work proposes a numerical procedure based upon Morse theory, an important branch of differential topology. The procedure counts and identifies the singularity-free regions that the singularity locus cuts out of the configuration space, as well as the disjoint regions composing the configuration space of a parallel manipulator. Moreover, given any two configurations of a manipulator, a feasible or a singularity-free path connecting them can always be found, or it can be proved that none exists. Examples of application to 3R and 6R serial manipulators, to 3UPS and 3UPU parallel wrists, to 3UPU parallel translational manipulators, and to 3RRR planar manipulators are reported in the work.
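As a rough illustration of the kind of question such a procedure answers (a naive grid-based sketch, not the Morse-theory method of the thesis), the following Python fragment counts the singularity-free regions of a toy planar 2R arm, whose serial singularities lie where sin(theta2) = 0, and tests whether two configurations are joined by a singularity-free path; link lengths, grid resolution and the singularity threshold are arbitrary assumptions.

    # Illustrative sketch only: flood-fill over a sampled joint space of a planar 2R
    # arm, whose Jacobian determinant is L1*L2*sin(theta2).
    import numpy as np
    from scipy.ndimage import label

    L1, L2 = 1.0, 1.0                      # link lengths (assumed)
    N = 400                                # grid resolution per joint
    theta = np.linspace(-np.pi, np.pi, N, endpoint=False)
    T1, T2 = np.meshgrid(theta, theta, indexing="ij")

    det_J = L1 * L2 * np.sin(T2)           # Jacobian determinant of the 2R arm
    free = np.abs(det_J) > 1e-2            # configurations away from singularities

    regions, n_regions = label(free)       # connected singularity-free regions
    print("singularity-free regions on the grid:", n_regions)   # 2 for this toy arm

    def same_region(q_a, q_b):
        """True if both configurations lie in the same singularity-free region,
        i.e. a singularity-free path between them exists on this grid."""
        ia = tuple(np.argmin(np.abs(theta - q)) for q in q_a)
        ib = tuple(np.argmin(np.abs(theta - q)) for q in q_b)
        return regions[ia] != 0 and regions[ia] == regions[ib]

    print(same_region((0.0, 0.5), (0.0, -0.5)))   # False: separated by the theta2 = 0 locus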

Relevance:

20.00%

Publisher:

Abstract:

A full set of geochemical and Sr, Nd and Pb isotope data on both bulk-rock and mineral samples is provided for volcanic rocks representative of the whole stratigraphic succession of Lipari Island in the Aeolian archipelago. These data, together with petrographic observations and melt/fluid inclusion investigations from the literature, outline the petrogenesis and evolution of magmas through the magmatic and eruptive history of Lipari. This history is the result of nine successive Eruptive Epochs developing between 271 ka and historical times, as derived from the most recent volcanological and stratigraphic studies combined with available radiometric ages and the correlation of tephra layers and marine terrace deposits. These Eruptive Epochs are characterized by distinctive vents, partly overlapping in space and time, mostly under the control of the main regional tectonic trends (NNW-SSE, N-S and minor E-W). A large variety of lava flows, scoriaceous deposits, lava domes, coulees and pyroclastics were emplaced, ranging in composition through time from calcalkaline (CA) and high-K (HKCA) basaltic andesites to rhyolites. CA and HKCA basaltic andesitic to dacitic magmas were erupted between 271 and 81 ka (Eruptive Epochs 1-6) from volcanic edifices located along the western coast of the island (and subordinately the eastern Monterosa) and from the M. Chirica and M. S. Angelo stratocones. These mafic to intermediate magmas evolved mainly through AFC and RAFC processes, involving fractionation of mafic phases, assimilation of wall rocks and mixing with newly injected mafic magmas. Following a 40 ka-long period of volcanic quiescence, rhyolitic magmas were later erupted from vents located in the southern and north-eastern sectors of Lipari between 40 ka and historical times (Eruptive Epochs 7-9). They are suggested to derive from the earlier mafic to intermediate melts through AFC processes. During the early phases of rhyolitic magmatism (Eruptive Epochs 7-8), enclave-rich rocks and banded pumices, ranging in composition from HKCA dacites to low-SiO2 rhyolites, were erupted, representing the products of magma mixing between fresh mafic magmas and the fractionated rhyolitic melts. The interaction of mantle-derived magmas with the crust represents an essential process throughout the whole magmatic history of Lipari, and is responsible for the wide range of observed geochemical and isotopic variations. The crustal contribution was particularly important during the intermediate phases of activity of Lipari, when the cordierite-bearing lavas were erupted from the M. S. Angelo volcano (Eruptive Epoch 5, 105 ka). These lavas are interpreted as the result of mixing and subsequent hybridization of mantle-derived magmas, akin to those characterizing the older phases of activity of Lipari (Eruptive Epochs 1-4), with crustal anatectic melts derived from dehydration-melting reactions of metapelites in the lower crust. A comparison between the adjacent islands of Lipari and Vulcano shows that their mafic to intermediate magmas appear to be genetically connected and to derive from a similar mantle source affected by different degrees of partial melting (and variable extents of crustal assimilation), producing either the CA magmas of Lipari (higher degrees) or the HKCA to SHO magmas of Vulcano (lower degrees).
On a regional scale, the most primitive rocks (SiO2 < 56%, MgO > 3.5%) of Lipari, Vulcano, Salina and Filicudi are suggested to derive from a similar MORB-like source, variably metasomatized by aqueous fluids coming from the slab and, subordinately, by the addition of sediments.

Relevance:

20.00%

Publisher:

Abstract:

We investigate the benefits that emerge when the fields of constraint programming and concurrency meet. On one hand, constraints can be used in concurrency theory to increase the conciseness and the expressive power of concurrent languages from a pragmatic point of view. On the other hand, problems modeled using constraints can be solved faster and more efficiently on a concurrent system. We explore both directions, providing two separate lines of contribution. First, we study the expressive power of a concurrent language, namely Constraint Handling Rules, that supports constraints as a primitive construct, and we show which features of this language make it Turing powerful. Then we propose a framework for solving constraint problems that is intended to be deployed on a concurrent system. For the development of this framework we used the concurrent language Jolie, following the Service-Oriented paradigm. Based on this experience, we also propose an extension to Service-Oriented languages to overcome some of their limitations and to improve the development of concurrent applications.
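Constraint Handling Rules itself is a committed-choice rule language, typically hosted in Prolog; purely as an illustration of its flavour of multiset rewriting over a constraint store (not code from the thesis), the classic leq program can be mimicked in Python as follows.

    # Minimal sketch, not CHR: propagate the transitivity rule
    #   leq(x, y), leq(y, z) ==> leq(x, z)
    # to a fixpoint, then apply the antisymmetry simplification
    #   leq(x, y), leq(y, x) <=> x = y
    def solve_leq(constraints):
        store = set(constraints)                  # constraint store of leq(x, y) pairs
        changed = True
        while changed:                            # transitivity, run to a fixpoint
            changed = False
            for (a, b) in list(store):
                for (c, d) in list(store):
                    if b == c and a != d and (a, d) not in store:
                        store.add((a, d))
                        changed = True
        # antisymmetry: a leq cycle between two variables forces an equality
        equalities = {(a, b) for (a, b) in store if (b, a) in store and a < b}
        store = {(a, b) for (a, b) in store if (b, a) not in store}
        return store, equalities

    store, eqs = solve_leq([("x", "y"), ("y", "z"), ("z", "x")])
    print(eqs)    # the cycle forces x = y, x = z, y = z
    print(store)  # remaining irreducible leq constraints (empty here)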

Relevance:

20.00%

Publisher:

Abstract:

Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) the identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) the elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) the implementation of a layout optimisation method that removes cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry. The integration of these constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards - as opposed to becoming so only when the system is final - and is more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.
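To give a concrete, if simplified, picture of layout-induced cache jitter, the sketch below greedily places functions so that pairs that interact frequently do not share direct-mapped instruction-cache sets; the cache geometry, function sizes and affinity data are hypothetical, and this is not the layout optimisation method of the thesis.

    # Illustrative sketch only: greedy code placement that keeps affine function
    # pairs on disjoint cache sets (assumes each affine group fits the cache together).
    CACHE_SETS, LINE = 256, 32                     # 8 KiB direct-mapped i-cache (assumed)

    def sets_of(addr, size):
        """Cache sets touched by code placed at addr, spanning size bytes."""
        first, last = addr // LINE, (addr + size - 1) // LINE
        return {line % CACHE_SETS for line in range(first, last + 1)}

    def place(functions, affinity):
        """functions: {name: code size in bytes}; affinity: pairs that call each other often.
        Returns {name: start address}, keeping affine pairs on disjoint sets."""
        layout, cursor = {}, 0
        for name, size in sorted(functions.items(), key=lambda kv: -kv[1]):
            addr = cursor
            while any(sets_of(addr, size) & sets_of(layout[o], functions[o])
                      for o in layout
                      if (name, o) in affinity or (o, name) in affinity):
                addr += LINE                        # slide down one cache line and retry
            layout[name] = addr
            cursor = max(cursor, addr + size)
        return layout

    funcs = {"control_loop": 3000, "sensor_filter": 2500, "logger": 4000}
    affine = {("control_loop", "sensor_filter")}
    print(place(funcs, affine))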

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis was to investigate the respective contributions of prior information and sensorimotor constraints to action understanding, and to estimate their consequences for the evolution of human social learning. Even though a huge amount of literature is dedicated to the study of action understanding and its role in social learning, these issues are still largely debated. Here, I critically describe two main perspectives. The first perspective interprets faithful social learning as an outcome of a fine-grained representation of others’ actions and intentions that requires sophisticated socio-cognitive skills. In contrast, the second perspective highlights the role of simpler decision heuristics, the recruitment of which is determined by individual and ecological constraints. The present thesis aims to show, through four experimental works, that these two contributions are not mutually exclusive. A first study investigates the role of the inferior frontal cortex (IFC), the anterior intraparietal area (AIP) and the primary somatosensory cortex (S1) in the recognition of other people’s actions, using a transcranial magnetic stimulation adaptation paradigm (TMSA). The second work studies whether, and how, higher-order and lower-order prior information (acquired from the probabilistic sampling of past events vs. derived from an estimation of the biomechanical constraints of observed actions) interacts during the prediction of other people’s intentions. Using a single-pulse TMS procedure, the third study investigates whether the interaction between these two classes of priors modulates motor system activity. The fourth study tests the extent to which behavioral and ecological constraints influence the emergence of faithful social learning strategies at the population level. The collected data help to elucidate how higher-order and lower-order prior expectations interact during action prediction, and clarify the neural mechanisms underlying this interaction. Finally, these works open promising perspectives for a better understanding of social learning, with possible extensions to animal models.
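One generic way to formalize how two such classes of priors could combine with the observed kinematic evidence (a standard Bayesian sketch, not necessarily the model adopted in these studies) is a precision-weighted product,
\[
P(a \mid e) \propto P(e \mid a)\, P(a), \qquad P(a) \propto P_{\mathrm{high}}(a)^{w}\, P_{\mathrm{low}}(a)^{1-w},
\]
where $P_{\mathrm{high}}$ and $P_{\mathrm{low}}$ are the higher-order (probabilistic) and lower-order (biomechanical) priors over candidate actions $a$, $e$ is the observed movement evidence, and the weight $w$ reflects the relative reliability attributed to the two priors.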

Relevance:

20.00%

Publisher:

Abstract:

This thesis addresses the issue of generating texts in the style of an existing author that also satisfy structural constraints imposed by the genre of the text. Although Markov processes are known to be suitable for representing style, they are difficult to control so as to satisfy non-local properties, such as structural constraints, that require long-distance modeling. The framework of Constrained Markov Processes makes it possible to generate texts that are consistent with a corpus while being controllable in terms of rhyme and meter. Constrained Markov processes consist in reformulating Markov processes in the context of constraint satisfaction. The thesis describes how to represent stylistic and structural properties in terms of constraints in this framework, and how this approach can be used for the generation of lyrics in the style of 60 different authors. An evaluation of the described method is provided by comparing it to both pure Markov and pure constraint-based approaches. Finally, the thesis describes the implementation of an augmented text editor, called Perec. Perec is intended to enhance creativity by helping the user to write lyrics and poetry, exploiting the techniques presented so far.
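A minimal sketch of the underlying idea (toy corpus and a crude rhyme constraint, not the thesis's system): keep a bigram model of the corpus and prune its transitions backwards, so that every partial sequence can still be completed into one of fixed length whose last word satisfies the constraint.

    # Constrained Markov generation, reduced to its essence: backward feasibility
    # pruning over a bigram model, then forward sampling inside the pruned sets.
    import random
    from collections import defaultdict

    corpus = "the night is long the night is bright the light is right".split()

    bigrams = defaultdict(set)
    for a, b in zip(corpus, corpus[1:]):
        bigrams[a].add(b)

    def feasible_sets(length, rhyme):
        """feasible[i] = words usable at position i so that a valid completion exists."""
        words = set(bigrams) | {w for ws in bigrams.values() for w in ws}
        feasible = [set(words) for _ in range(length)]
        feasible[-1] = {w for w in feasible[-1] if w.endswith(rhyme)}  # unary constraint
        for i in range(length - 2, -1, -1):                            # backward pruning
            feasible[i] = {w for w in feasible[i] if bigrams[w] & feasible[i + 1]}
        return feasible

    def generate(length, rhyme):
        feasible = feasible_sets(length, rhyme)
        word = random.choice(sorted(feasible[0]))
        out = [word]
        for i in range(1, length):
            word = random.choice(sorted(bigrams[word] & feasible[i]))
            out.append(word)
        return " ".join(out)

    print(generate(5, "ight"))   # e.g. "light is bright the night"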

Relevance:

20.00%

Publisher:

Abstract:

This paper analyzes the effect that different designs for access to financial transmission rights have on spot electricity auctions. In particular, I characterize the equilibrium in the spot electricity market when financial transmission rights are assigned to the grid operator and when they are assigned to the firm that submits the lowest bid in the spot electricity auction. When financial transmission rights are assigned to the grid operator, my model, in contrast with the models available in the literature, works out the equilibrium for any transmission capacity. Moreover, I find that an increase in transmission capacity increases competition not only between markets but also within a single market. When financial transmission rights are assigned to the firm that submits the lowest bid in the spot electricity auction, firms compete not only for electricity demand but also for transmission rights and the arbitrage profits derived from holding them. I find that introducing competition for transmission rights reduces competition in spot electricity auctions.
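For reference, the value of such a right is the congestion rent between the nodes it spans (a standard definition, not a result of the paper):
\[
\pi_{\mathrm{FTR}} = Q\,(p_{\mathrm{sink}} - p_{\mathrm{source}}),
\]
where $Q$ is the contracted quantity and $p_{\mathrm{sink}}$, $p_{\mathrm{source}}$ are the spot prices at the two ends of the line; the arbitrage profit mentioned above accrues to the holder whenever transmission capacity binds and the two prices separate.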

Relevance:

20.00%

Publisher:

Abstract:

The development of High-Integrity Real-Time Systems has a high footprint in terms of human, material and schedule costs. Factoring functional, reusable logic into the application favors incremental development and contains costs. Yet, achieving incrementality in the timing behavior is a much harder problem. Complex features at all levels of the execution stack, aimed at boosting average-case performance, exhibit timing behavior that is highly dependent on execution history, which wrecks time composability and, with it, incrementality. Our goal here is to restore time composability to the execution stack, working bottom up across it. We first characterize time composability without making assumptions on the system architecture or on the software deployment to it. Later, we focus on the role played by the real-time operating system in our pursuit. Initially we consider single-core processors and, becoming less permissive about the admissible hardware features, we devise solutions that restore a convincing degree of time composability. To show what can be done in practice, we developed TiCOS, an ARINC-compliant kernel, and re-designed ORK+, a kernel for Ada Ravenscar runtimes. In that work, we added support for limited preemption to ORK+, a first in the landscape of real-world kernels. Our implementation allows resource sharing to coexist with limited-preemptive scheduling, which extends the state of the art. We then turn our attention to multicore architectures, first considering partitioned systems, for which we achieve results close to those obtained for single-core processors. Subsequently, we move away from the over-provisioning typical of those systems and consider less restrictive uses of homogeneous multiprocessors, where the scheduling algorithm is key to high schedulable utilization. To that end we single out RUN, a promising baseline, and extend it into SPRINT, which supports sporadic task sets and hence better matches real-world industrial needs. To corroborate our results we present findings from real-world case studies from the avionics industry.
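As background for the scheduling side of this work, the classical fixed-priority response-time recurrence (a textbook test, not a contribution of the thesis; limited-preemptive variants add a blocking term from lower-priority non-preemptive regions) shows how worst-case execution times compose across tasks; the task parameters below are illustrative only.

    # Textbook response-time analysis: R_i = C_i + sum_{j in hp(i)} ceil(R_i / T_j) * C_j,
    # iterated to a fixed point. Tasks are listed highest priority first, deadlines = periods.
    import math

    def response_time(C, T, i):
        """Worst-case response time of task i, or None if it misses its deadline."""
        R = C[i]
        while True:
            interference = sum(math.ceil(R / T[j]) * C[j] for j in range(i))
            R_next = C[i] + interference
            if R_next == R:
                return R
            if R_next > T[i]:          # deadline (= period) exceeded
                return None
            R = R_next

    C = [1, 2, 4]        # worst-case execution times (assumed)
    T = [5, 10, 20]      # periods (assumed)
    for i in range(len(C)):
        print(f"task {i}: R = {response_time(C, T, i)}")   # 1, 3, 8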

Relevance:

20.00%

Publisher:

Abstract:

Redshift Space Distortions (RSD) are an apparent anisotropy in the distribution of galaxies due to their peculiar motion. These features are imprinted in the correlation function of galaxies, which describes how these structures are distributed around one another. RSD can be represented by a distortion parameter $\beta$, which is strictly related to the growth of cosmic structures. For this reason, measurements of RSD can be exploited to constrain the cosmological parameters, such as, for example, the neutrino mass. Neutrinos are neutral subatomic particles that come in three flavours: the electron, the muon and the tau neutrino. Their mass differences can be measured in oscillation experiments, while information on the absolute scale of the neutrino mass can come from cosmology, since neutrinos leave a characteristic imprint on the large-scale structure of the Universe. The aim of this thesis is to provide constraints on the accuracy with which the neutrino mass can be estimated when exploiting measurements of RSD. In particular, we want to describe how the error on the neutrino mass estimate depends on three fundamental parameters of a galaxy redshift survey: the density of the catalogue, the bias of the sample considered and the volume observed. In doing this we make use of the BASICC Simulation, from which we extract a series of dark matter halo catalogues characterized by different values of bias, density and volume. These mock data are analysed via a Markov Chain Monte Carlo procedure, in order to estimate the neutrino mass fraction, using the software package CosmoMC, which has been suitably modified. In this way we are able to extract a fitting formula describing our measurements, which can be used to forecast the precision reachable in future surveys like Euclid using this kind of observation.
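For reference, the linear-theory relations behind the distortion parameter (textbook definitions, not results of the thesis) are
\[
\beta \equiv \frac{f(z)}{b}, \qquad f(z) \equiv \frac{d\ln D}{d\ln a} \simeq \Omega_m(z)^{0.55},
\]
where $f$ is the linear growth rate, $D$ the growth factor, $a$ the scale factor and $b$ the linear bias of the tracer; since massive neutrinos suppress the growth of structure, measurements of $\beta$ from RSD are sensitive to the neutrino mass fraction.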