21 results for parallel robots, cable driven, underactuated, calibration, sensitivity, accuracy


Relevance:

30.00%

Publisher:

Abstract:

Despite the many challenges faced in the past, silicon technology has continued to evolve at a steady pace. Today an ever-increasing number of cores is integrated onto the same die. Unfortunately, the extraordinary performance achievable with the many-core paradigm is limited by several factors: memory bandwidth limitations, combined with inefficient synchronization mechanisms, can severely undermine the potential computational capabilities. Moreover, the huge HW/SW design space requires accurate and flexible tools to perform architectural explorations and to validate design choices. This thesis focuses on these aspects: a flexible and accurate Virtual Platform has been developed, targeting a reference many-core architecture. This tool has been used to perform architectural explorations, focusing on the instruction caching architecture and on hybrid HW/SW synchronization mechanisms. Besides architectural implications, another issue of embedded systems is considered: energy efficiency. Near-Threshold Computing (NTC) is a key research area in the Ultra-Low-Power domain, as it promises a tenfold improvement in energy efficiency compared to super-threshold operation and mitigates thermal bottlenecks. The physical implications of deep sub-micron technology severely limit the performance and reliability of modern designs. Reliability becomes a major obstacle when operating in NTC; in particular, memory operation becomes unreliable and can compromise system correctness. In the present work a novel hybrid memory architecture is devised to overcome reliability issues and, at the same time, improve energy efficiency by means of aggressive voltage scaling when allowed by workload requirements. Variability is another great drawback of near-threshold operation: the greatly increased sensitivity to threshold voltage variations is today a major concern for electronic devices. We introduce a variation-tolerant extension of the baseline many-core architecture: by means of micro-architectural knobs and a lightweight runtime control unit, the baseline architecture becomes dynamically tolerant to variations.
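
As a flavour of the trace-driven experiments such a Virtual Platform supports, the sketch below simulates a direct-mapped instruction cache over a synthetic fetch trace. The cache geometry, the trace, and the hit-rate metric are illustrative assumptions, not the thesis tool or its reference architecture.

```python
# Minimal trace-driven instruction-cache model, illustrating the kind of
# architectural exploration a many-core Virtual Platform enables.
# Geometry and trace are illustrative assumptions, not the thesis setup.

def simulate_icache(trace, size_bytes=4096, line_bytes=32):
    """Direct-mapped cache: return the hit rate over a stream of fetch addresses."""
    n_lines = size_bytes // line_bytes
    tags = [None] * n_lines          # one tag per cache line
    hits = 0
    for addr in trace:
        line = addr // line_bytes
        index = line % n_lines       # direct-mapped placement
        tag = line // n_lines
        if tags[index] == tag:
            hits += 1
        else:
            tags[index] = tag        # miss: fill the line
    return hits / len(trace)

# Synthetic trace: a hot loop body re-executed many times.
loop = list(range(0x1000, 0x1200, 4)) * 100
print(f"hit rate: {simulate_icache(loop):.3f}")
```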

Relevance:

30.00%

Publisher:

Abstract:

This work focuses on the study of saltwater intrusion in coastal aquifers and, in particular, on the development of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropogenic factors, both exhibiting strongly random behaviour, which should be considered for an optimal management of the territory and of water resources. Given the uncertainty in the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, key hydrogeological parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model reduction technique, based on Polynomial Chaos Expansion, which provides an accurate description of the model without a large computational burden. When the assumptions of classical analytical models are not met, as often happens in real case studies (including the area analyzed in the present work), one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
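
To illustrate how a global sensitivity analysis can be cast over a sharp-interface model, the sketch below computes first-order Sobol indices for a toy Ghyben-Herzberg response using a pick-freeze Monte Carlo estimator. The aquifer geometry, the parameter ranges, and the brute-force sampling are assumptions of this sketch; the thesis uses a Polynomial Chaos Expansion surrogate precisely to avoid this computational burden.

```python
# Variance-based global sensitivity analysis for a sharp-interface toy model
# of saltwater intrusion. Geometry, parameter ranges, and the Ghyben-Herzberg
# response are assumptions of this sketch, not the thesis case study.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def interface_depth(K, W, x=500.0, L=2000.0):
    """Ghyben-Herzberg depth (m) of the fresh/salt interface below sea level
    at distance x from the coast, for an unconfined coastal aquifer of length
    L with uniform recharge W (m/d) and hydraulic conductivity K (m/d)."""
    head = np.sqrt(W / K * (2 * L * x - x**2))  # Dupuit freshwater head
    return 40.0 * head                          # rho_f / (rho_s - rho_f) ~ 40

def sample(n):
    K = rng.uniform(5.0, 50.0, n)    # hydraulic conductivity, m/d
    W = rng.uniform(1e-4, 1e-3, n)   # recharge, m/d
    return [K, W]

# Pick-freeze (Saltelli-type) estimator of first-order Sobol indices.
A, B = sample(N), sample(N)
yA, yB = interface_depth(*A), interface_depth(*B)
total_var = yA.var()
for i, name in enumerate(("K", "W")):
    mixed = list(B)
    mixed[i] = A[i]                  # freeze parameter i to sample A
    S1 = np.mean(yA * (interface_depth(*mixed) - yB)) / total_var
    print(f"first-order Sobol index S_{name}: {S1:.2f}")
```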

Relevance:

30.00%

Publisher:

Abstract:

Neuroinflammation is a major player in the etiopathology of neurodegenerative diseases (NDDs), orchestrating several neurotoxic pathways that in concert lead to neurodegeneration. A positive feedback loop occurs between inflammation, microglia activation, and misfolding processes that, alongside excitotoxicity and oxidative events, represent crucial features of this intricate scenario. The multi-layered nature of NDDs requires a deeper investigation of how these vicious cycles work, which could further help the search for effective treatments. Electrophiles are critically involved in the modulation of a variety of neuroprotective responses. We therefore envisioned their peculiar ability to switch biological activities on and off as a powerful tool for investigating the neurotoxic scenario driven by inflammation in NDDs. In particular, in this thesis project we set out to dissect at the molecular level the functional role of (pro)electrophilic moieties of previously synthesized thioesters of variously substituted trans-cinnamic acids, to identify crucial features that could interfere with amyloid aggregation as well as modulate Nrf2 and/or NF-κB activation. To this aim, we first synthesized new compounds to identify bioactive cores that could specifically modulate the intended target. We then systematically modified their structure to reach additional pathogenic pathways that could contribute in tandem to the inflammatory process. In particular, after investigating the mechanistic underpinnings of the catechol feature in amyloid binding through the synthesis of new dihydroxyl derivatives, we incorporated the identified antiaggregating nucleus into constrained frames that could also counteract neuroinflammation through the modulation of CB2Rs. In parallel, Nrf2 and/or NF-κB anti-inflammatory structural requirements were combined with the neuroprotective cores of pioglitazone, an antidiabetic drug endowed with MAO-B inhibitory properties, and memantine, which notably counteracts excitotoxicity. By acting as Swiss army knives, the new set of molecules emerges as a promising tool to deepen our insight into the complex scenario regulating NDDs.

Relevance:

30.00%

Publisher:

Abstract:

Intelligent systems are now pervasive in society, supporting a synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and poor software behaviour may invalidate any attempted improvement. In addition, data-driven machine learning algorithms are the basis of human-centered applications, and their interpretability is one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic, life-long system operation. As most software registers its inner events by means of logs, log analysis is an approach to keeping systems operating properly. Logs are Big Data assembled in high-volume streams, and are unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods that provide maintenance solutions for anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses; LP provides a deeper semantic interpretation of the anomalous occurrences. The solutions evolve over time and are general-purpose, making them highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed to automate the parsing of system logs. All the methods perform recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving intelligent systems literature, the proposed method, eLP, is able to process streams of words and sentences. In terms of AD accuracy, FBeM achieved (85.64 ± 3.69)%, eGNN reached (96.17 ± 0.78)%, eGFC obtained (92.48 ± 1.21)%, and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP in particular generates a log grammar and presents a higher level of model interpretability.
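
The sketch below illustrates the create/update mechanics common to evolving granular classifiers: each stream sample either refines the nearest compatible granule or spawns a new one. The spherical granules, the fixed radius, and the mean-only update are simplifying assumptions of this illustration, not the FBeM, eGNN, eGFC, or eLP algorithms themselves.

```python
# Minimal evolving granular classifier, sketching the create/update loop
# shared by evolving granular models. Thresholds and update rules are
# simplifying assumptions of this sketch, not the thesis algorithms.
import numpy as np

class EvolvingGranules:
    def __init__(self, radius=1.0):
        self.radius = radius              # max distance to join a granule
        self.granules = []                # list of (center, n_samples, label)

    def _nearest(self, x):
        if not self.granules:
            return None, np.inf
        d = [np.linalg.norm(x - c) for c, _, _ in self.granules]
        i = int(np.argmin(d))
        return i, d[i]

    def learn(self, x, label):
        x = np.asarray(x, dtype=float)
        i, d = self._nearest(x)
        if d <= self.radius and self.granules[i][2] == label:
            c, n, y = self.granules[i]    # update: recursive mean of center
            self.granules[i] = ((n * c + x) / (n + 1), n + 1, y)
        else:
            self.granules.append((x, 1, label))  # create a new granule

    def predict(self, x):
        i, _ = self._nearest(np.asarray(x, dtype=float))
        return None if i is None else self.granules[i][2]

clf = EvolvingGranules(radius=1.0)
stream = [([0.1, 0.2], "normal"), ([0.2, 0.1], "normal"), ([5.0, 5.1], "anomaly")]
for x, y in stream:
    clf.learn(x, y)                       # one pass over the stream
print(clf.predict([0.15, 0.15]), clf.predict([5.2, 5.0]))
```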

Relevance:

30.00%

Publisher:

Abstract:

The most widespread work-related diseases are musculoskeletal disorders (MSDs), caused by awkward postures and excessive effort of the upper-limb muscles during work operations. Wearable IMU sensors can monitor workers constantly to prevent hazardous actions, thus reducing work injuries. In this thesis, procedures for ergonomic analyses in a working environment are developed and tested, based on a commercial motion capture (MoCap) system made of 17 Inertial Measurement Units (IMUs). An IMU usually comprises a tri-axial gyroscope, a tri-axial accelerometer, and a tri-axial magnetometer which, through sensor fusion algorithms, estimate its attitude. Effective strategies for preventing MSDs rely on several aspects: first, the accuracy of the IMU, which depends on the chosen sensor and its calibration; second, the correct identification of the pose of each sensor on the worker's body; third, the chosen multibody model, which must balance accuracy against computational burden to provide results in real time; and finally, the model scaling law, which determines whether the multibody model geometry can be personalized quickly and accurately. Moreover, MSDs can be reduced by using collaborative robots (cobots) as assistive devices for complex or heavy operations, relieving the worker's effort during repetitive tasks. All these aspects are considered to test and demonstrate the efficiency and usability of inertial MoCap systems for real-time ergonomic evaluation and for implementing safety control strategies in collaborative robotics. Validation is performed with several experimental tests, both to assess the proposed procedures and to compare the results of the real-time multibody models developed in this thesis with those from commercial software. As an additional result, the positive effects of using cobots as assistive devices for reducing human effort in repetitive industrial tasks are also shown, demonstrating the potential of wearable electronics in on-field ergonomic analyses for industrial applications.
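
As an example of the sensor fusion an IMU performs, the sketch below implements a one-angle complementary filter that blends gyroscope integration with the accelerometer's gravity reference. The filter gain, sample rate, and synthetic readings are assumptions for illustration; commercial MoCap suits typically employ more elaborate, Kalman-type fusion.

```python
# Minimal complementary filter for one attitude angle (e.g. roll), sketching
# how gyroscope and accelerometer data can be fused. Gain, sample rate, and
# the synthetic readings are illustrative assumptions of this sketch.
import math

def complementary_filter(gyro_rates, accel_pairs, dt=0.01, alpha=0.98):
    """gyro_rates: angular rate (rad/s); accel_pairs: (a_y, a_z) in g units.
    Gyro integration tracks fast motion; the accelerometer's gravity
    direction corrects the slow drift."""
    angle = 0.0
    history = []
    for omega, (a_y, a_z) in zip(gyro_rates, accel_pairs):
        gyro_angle = angle + omega * dt        # propagate with the gyro
        accel_angle = math.atan2(a_y, a_z)     # gravity-referenced angle
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
        history.append(angle)
    return history

# Static sensor with a slightly biased gyro: the accelerometer term keeps the
# estimate bounded near 0 rad instead of drifting with the bias.
n = 1000
angles = complementary_filter([0.01] * n, [(0.0, 1.0)] * n)
print(f"final angle: {angles[-1]:.4f} rad")
```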

Relevance:

30.00%

Publisher:

Abstract:

This Dissertation shows how recent statistical analysis tools and open datasets can be exploited to improve modelling accuracy in two distinct yet interconnected domains of flood hazard (FH) assessment. In the first Part, unsupervised artificial neural networks are employed as regional models for sub-daily rainfall extremes. The models aim to learn a robust relation for estimating locally the parameters of Gumbel distributions of extreme rainfall depths for any sub-daily duration (1-24 h). The predictions depend on twenty morphoclimatic descriptors. A large study area in north-central Italy is adopted, where 2238 annual maximum series are available; validation is performed over an independent set of 100 gauges. Our results show that multivariate ANNs may remarkably improve the estimation of percentiles relative to the benchmark approach from the literature, in which the Gumbel parameters depend on mean annual precipitation. Finally, we show that the very nature of the proposed ANN models makes them suitable for interpolating predicted sub-daily rainfall quantiles across space and across time-aggregation intervals. In the second Part, decision trees are used to combine a selected blend of input geomorphic descriptors for predicting FH. Relative to existing DEM-based approaches, this method is innovative in that it combines three characteristics: (1) simple multivariate models, (2) a set of exclusively DEM-based descriptors as input, and (3) an existing FH map as reference information. The methods are applied first to northern Italy, represented with the MERIT DEM (∼90 m resolution), and then to the whole of Italy, represented with the EU-DEM (25 m resolution). The results show that multivariate approaches may (a) significantly enhance the delineation of flood-prone areas relative to a selected univariate one, (b) provide accurate predictions of expected inundation depths, (c) produce encouraging results in extrapolation, (d) complete the information of imperfect reference maps, and (e) conveniently convert binary maps into a continuous representation of FH.
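
For concreteness, the sketch below carries out the distributional step behind these regional models: fitting Gumbel parameters to an annual-maximum series and inverting the distribution for a given return period. The method-of-moments fit and the synthetic series are assumptions of this sketch; in the thesis the parameters are instead predicted by ANNs from twenty morphoclimatic descriptors.

```python
# Gumbel quantiles for rainfall extremes: a minimal sketch of the
# distributional step behind the regional models. The synthetic series and
# the method-of-moments fit are illustrative assumptions of this sketch.
import math
import numpy as np

rng = np.random.default_rng(0)
annual_maxima = rng.gumbel(loc=30.0, scale=8.0, size=60)  # synthetic series (mm)

# Method-of-moments estimates of the Gumbel scale and location parameters.
euler_gamma = 0.57721566490153286
scale = annual_maxima.std(ddof=1) * math.sqrt(6) / math.pi
loc = annual_maxima.mean() - euler_gamma * scale

def gumbel_quantile(T, loc, scale):
    """Rainfall depth with return period T years: invert F = 1 - 1/T."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

for T in (10, 50, 100):
    print(f"T = {T:>3} yr: {gumbel_quantile(T, loc, scale):6.1f} mm")
```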