20 results for Concept-based Terminology

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

90.00%

Publisher:

Abstract:

The abundance of visual data and the push for robust AI are driving the need for automated visual sensemaking. Computer Vision (CV) faces growing demand for models that can discern not only what images "represent," but also what they "evoke." This is a demand for tools mimicking human perception at a high semantic level, categorizing images based on concepts like freedom, danger, or safety. However, automating this process is challenging due to entropy, scarcity, subjectivity, and ethical considerations. These challenges not only impact performance but also underscore the critical need for interpretability. This dissertation focuses on abstract concept-based (AC) image classification, guided by three technical principles: situated grounding, performance enhancement, and interpretability. We introduce ART-stract, a novel dataset of cultural images annotated with ACs, serving as the foundation for a series of experiments across four key domains: assessing the effectiveness of the end-to-end DL paradigm, exploring cognitive-inspired semantic intermediaries, incorporating cultural and commonsense aspects, and neuro-symbolic integration of sensory-perceptual data with cognitive-based knowledge. Our results demonstrate that integrating CV approaches with semantic technologies yields methods that surpass the current state of the art in AC image classification, outperforming the end-to-end deep vision paradigm. The results emphasize the role semantic technologies can play in developing both effective and interpretable systems, through capturing, situating, and reasoning over knowledge related to visual data. Furthermore, this dissertation explores the complex interplay between technical and socio-technical factors. By merging technical expertise with an understanding of human and societal aspects, we advocate for responsible labeling and training practices in visual media. These insights and techniques not only advance efforts in CV and explainable artificial intelligence but also propel us toward an era of AI development that harmonizes technical prowess with deep awareness of its human and societal implications.
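
To make the task concrete, the sketch below shows a minimal zero-shot baseline that scores an image against a handful of abstract concept labels with a pre-trained vision-language model. It illustrates the end-to-end deep vision paradigm that the dissertation compares against, not the proposed semantics-enhanced method; the model checkpoint, prompt template, concept list and image path are illustrative assumptions.

```python
# Minimal sketch (not the dissertation pipeline): zero-shot scoring of an image
# against abstract concept labels with a pre-trained vision-language model.
# Model name, labels, prompt template and file path are illustrative assumptions.
from PIL import Image
import torch
from transformers import CLIPModel, CLIPProcessor

concepts = ["freedom", "danger", "safety", "comfort", "power"]  # example ACs
prompts = [f"an image evoking the concept of {c}" for c in concepts]

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("cultural_image.jpg")  # placeholder path
inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(**inputs).logits_per_image  # shape: (1, len(concepts))
probs = logits.softmax(dim=-1).squeeze(0)

for concept, p in zip(concepts, probs.tolist()):
    print(f"{concept}: {p:.3f}")
```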

Relevance:

80.00%

Publisher:

Abstract:

Chapter 1 presents a brief introduction to the state of the art of nanotechnologies and nanofabrication techniques, and introduces unconventional lithography as a route to fabricating a novel electronic device, the resistive switch known as the memristor. Chapter 2 gives a detailed description of the main fabrication and characterization techniques employed in this work. Chapter 3 describes parallel local oxidation lithography (pLOx) as the main technique used to obtain an accurate patterning process: all the relevant parameters were studied, and the optimized conditions proved highly reproducible, yielding excellent patterned nanostructures. The effect of negative bias, termed local reduction (LR), was also studied; moreover, the use of AC bias yields a faster patterning process than DC bias. Chapter 4 (metal/e-SiO2/Si nanojunction) shows how the electrochemically grown oxide nanostructures obtained by pLOx can be used to fabricate a novel device, the memristor. We demonstrate a new concept, based on conventional materials, in which the device-lifetime problem is resolved by introducing a "regeneration" step that restores the nano-memristor to its pristine condition by applying an appropriate voltage cycle. In Chapter 5 (graphene/e-SiO2/Si), graphene is used as a building-block electrode to selectively oxidize the silicon substrate with the pLOx setup for the fabrication of a novel resistive-switch device. Chapter 6 (surface architecture) shows a further application of pLOx in biotechnology: surface functionalization combined with pLOx nano-patterning is used to design surfaces that bind biomolecules accurately, opening the way to studying their properties and to further applications in nano-bio device fabrication. Finally, nano-patterning of DNA used as a scaffold enables the fabrication of small functional nano-components for biochips and electronic and optical/photonic devices.

Relevance:

40.00%

Publisher:

Abstract:

The Deep Underground Neutrino Experiment (DUNE) is a long-baseline accelerator experiment designed to make a significant contribution to the study of neutrino oscillations with unprecedented sensitivity. The main goal of DUNE is the determination of the neutrino mass ordering and the leptonic CP violation phase, key parameters of the three-neutrino flavor mixing that have yet to be determined. An important component of the DUNE Near Detector complex is the System for on-Axis Neutrino Detection (SAND) apparatus, which will include GRAIN (GRanular Argon for Interactions of Neutrinos), a novel liquid Argon detector aimed at imaging neutrino interactions using only scintillation light. For this purpose, an innovative optical readout system based on Coded Aperture Masks is investigated. This dissertation aims to demonstrate the feasibility of reconstructing particle tracks and the topology of CCQE (Charged Current Quasi Elastic) neutrino events in GRAIN with such a technique. To this end, the development and implementation of a reconstruction algorithm based on Maximum Likelihood Expectation Maximization was carried out to directly obtain a three-dimensional distribution proportional to the energy deposited by charged particles crossing the LAr volume. This study includes the evaluation of the design of several camera configurations and the simulation of a multi-camera optical system in GRAIN.
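
For reference, the textbook MLEM update that such a reconstruction builds on can be sketched as follows: a voxelized light-emission estimate is repeatedly rescaled by the back-projected ratio between measured and predicted camera counts. This is a generic illustration with a random placeholder system matrix, not the GRAIN optical model or the thesis implementation.

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """Maximum Likelihood Expectation Maximization for y ≈ A @ x, with x >= 0.

    A : (n_pixels, n_voxels) system matrix mapping voxel intensities to
        expected counts on the coded-aperture camera pixels (placeholder here).
    y : (n_pixels,) measured photon counts.
    """
    x = np.ones(A.shape[1])             # flat initial estimate
    sensitivity = A.sum(axis=0) + eps   # back-projection of ones (normalization)
    for _ in range(n_iter):
        expected = A @ x + eps          # forward projection of current estimate
        ratio = y / expected            # measured / predicted counts
        x *= (A.T @ ratio) / sensitivity    # multiplicative MLEM update
    return x

# Toy example with a random system matrix standing in for the real optics model
rng = np.random.default_rng(0)
A = rng.random((200, 64))
x_true = rng.random(64)
y = rng.poisson(A @ x_true)
x_hat = mlem(A, y)
print(x_hat[:5])
```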

Relevance:

40.00%

Publisher:

Abstract:

DUNE is a next-generation long-baseline neutrino oscillation experiment. It aims to measure the still unknown $ \delta_{CP} $ violation phase and the sign of $ \Delta m_{13}^2 $, which defines the neutrino mass ordering. DUNE will exploit a Far Detector composed of four multi-kiloton LArTPCs and a Near Detector (ND) complex located close to the neutrino source at Fermilab. The SAND detector at the ND complex is designed to perform on-axis beam monitoring, constrain uncertainties in the oscillation analysis and perform precision neutrino physics measurements. SAND includes a 0.6 T superconducting magnet, an electromagnetic calorimeter, a 1-ton liquid Argon detector - GRAIN - and a modular, low-density straw tube target tracker system. GRAIN is an innovative LAr detector in which neutrino interactions can be reconstructed using only the LAr scintillation light, imaged by an optical system based on Coded Aperture masks and lenses - a novel approach never used before in particle physics applications. In this thesis, a first evaluation of GRAIN track reconstruction and calorimetric capabilities was obtained with an optical system based on Coded Aperture cameras. A simulation of $\nu_\mu + Ar$ interactions with the energy spectrum expected at the future Fermilab Long Baseline Neutrino Facility (LBNF) was performed. The performance of SAND on the selection of a $ \nu_\mu + Ar \to \mu^- + p + X $ sample and on the neutrino energy reconstruction was evaluated by combining the information provided by all its sub-detectors.

Relevance:

30.00%

Publisher:

Abstract:

Electronic business surely represents the new development perspective for world-wide trade. Together with the idea of e-business, and the need to exchange business messages between trading partners, the concept of business-to-business (B2B) integration arose. B2B integration is becoming necessary to allow partners to communicate and exchange business documents, such as catalogues, purchase orders, reports and invoices, overcoming architectural, applicative and semantic differences, according to the business processes implemented by each enterprise. Business relationships can be very heterogeneous, and consequently there are various ways to integrate enterprises with each other. Moreover, nowadays not only large enterprises but also small and medium enterprises are moving towards e-business: more than two-thirds of Small and Medium Enterprises (SMEs) use the Internet as a business tool. One of the business areas actively facing the interoperability problem is supply chain management. In order to really allow SMEs to improve their business and to fully exploit ICT technologies in their business transactions, three main players must be considered and joined: the new emerging ICT technologies, the scenario and requirements of the enterprises, and the world of standards and standardisation bodies. This thesis presents the definition and development of an interoperability framework (and the related standardisation initiatives) to provide the Textile/Clothing sector with a shared set of business documents and protocols for electronic transactions. Considering also some limitations, the thesis proposes an ontology-based approach to improve the functionalities of the developed framework and, exploiting the technologies of the semantic web, to improve the standardisation life-cycle, intended as the development, dissemination and adoption of B2B protocols for a specific business domain. The use of ontologies allows the semantic modelling of knowledge domains, upon which it is possible to develop a set of components for a better management of B2B protocols and to ease their comprehension and adoption by the target users.
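
As a toy illustration of the ontology-based approach (not the framework's actual ontology), the snippet below uses rdflib to declare a tiny fragment of a B2B document vocabulary; all class, property and instance names, as well as the namespace URI, are invented for the example.

```python
# Minimal sketch with rdflib: a toy fragment of a B2B document ontology
# (purchase orders, invoices) of the kind such a framework could build on.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

B2B = Namespace("http://example.org/b2b#")   # hypothetical namespace
g = Graph()
g.bind("b2b", B2B)

# Classes for business documents exchanged along the textile/clothing supply chain
for cls in ("BusinessDocument", "PurchaseOrder", "Invoice", "Catalogue", "TradingPartner"):
    g.add((B2B[cls], RDF.type, OWL.Class))
for sub in ("PurchaseOrder", "Invoice", "Catalogue"):
    g.add((B2B[sub], RDFS.subClassOf, B2B.BusinessDocument))

# A property linking a document to the trading partner that issued it
g.add((B2B.issuedBy, RDF.type, OWL.ObjectProperty))
g.add((B2B.issuedBy, RDFS.domain, B2B.BusinessDocument))
g.add((B2B.issuedBy, RDFS.range, B2B.TradingPartner))

# An example instance
g.add((B2B.po001, RDF.type, B2B.PurchaseOrder))
g.add((B2B.po001, RDFS.label, Literal("Purchase order 001")))

print(g.serialize(format="turtle"))
```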

Relevance:

30.00%

Publisher:

Abstract:

A prevalent claim is that we are in a knowledge economy. When we talk about the knowledge economy, we generally mean the concept of the "knowledge-based economy", indicating the use of knowledge and technologies to produce economic benefits. Knowledge is thus both the tool and the raw material (people's skills) for producing some kind of product or service. In this kind of environment, economic organization is undergoing several changes: authority relations are less important, legal and ownership-based definitions of the boundaries of the firm are becoming irrelevant, and there are only few constraints on the set of coordination mechanisms. Hence what characterises a knowledge economy is the growing importance of human capital in productive processes (Foss, 2005) and the increasing knowledge intensity of jobs (Hodgson, 1999). Economic processes are also highly intertwined with social processes: they are likely to be informal and reciprocal rather than formal and negotiated. Another important point is the problem of the division of labor: as economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, the task of dividing the job and assigning it to the most appropriate individuals becomes arduous, a "supervisory problem" (Hodgson, 1999) emerges, and traditional hierarchical control may prove increasingly ineffective. Not only does the specificity of know-how make it awkward to monitor the execution of tasks; more importantly, top-down integration of skills may be difficult because 'the nominal supervisors will not know the best way of doing the job – or even the precise purpose of the specialist job itself – and the worker will know better' (Hodgson, 1999). We therefore expect the organization of the economic activity of specialists to be, at least partially, self-organized. The aim of this thesis is to bridge studies from computer science, and in particular from Peer-to-Peer (P2P) Networks, to organization theories. We think that the P2P paradigm fits well with organization problems related to all those situations in which a central authority is not possible. We believe that P2P networks show a number of characteristics similar to firms working in a knowledge-based economy, and hence that the methodology used for studying P2P networks can be applied to organization studies. There are three main characteristics that we think P2P networks have in common with firms involved in the knowledge economy:
- Decentralization: in a pure P2P system every peer is an equal participant; there is no central authority governing the actions of the single peers.
- Cost of ownership: P2P computing implies shared ownership, reducing the cost of owning the systems and the content, and the cost of maintaining them.
- Self-organization: the process in a system leading to the emergence of global order within the system without the presence of another system dictating this order.
These characteristics are also present in the kind of firm we try to address, and that is why we have shifted the techniques we adopted for studies in computer science (Marcozzi et al., 2005; Hales et al., 2007) to management science.

Relevance:

30.00%

Publisher:

Abstract:

Background and aims: Sorafenib is the reference therapy for advanced Hepatocellular Carcinoma (HCC). No method exists to predict, in the very early period, the subsequent individual response. Starting from the clinical experience in humans that subcutaneous metastases may rapidly change consistency under sorafenib, and that elastosonography, a new ultrasound-based technique, allows assessment of tissue stiffness, we investigated the role of elastosonography in the very early prediction of tumor response to sorafenib in an HCC animal model. Methods: HCC (Huh7 cells) subcutaneous xenografting in mice was utilized. Mice were randomized to vehicle or treatment with sorafenib when tumor size was 5-10 mm. Elastosonography (Mylab 70XVG, Esaote, Genova, Italy) of the whole tumor mass on a sagittal plane with a 10 MHz linear transducer was performed at different time points from treatment start (day 0, +2, +4, +7 and +14) until mice were sacrificed (day +14), with the operator blind to treatment. In order to overcome variability in absolute elasticity measurements when assessing changes over time, values were expressed in arbitrary units as the relative stiffness of the tumor tissue in comparison to the stiffness of a standard reference stand-off pad lying on the skin over the tumor. Results: Sorafenib-treated mice showed a smaller tumor size increase at day +14 in comparison to vehicle-treated mice (tumor volume increase +192.76% vs +747.56%, p=0.06). Among sorafenib-treated tumors, 6 mice showed a better response to treatment than the other 4 (increase in volume +177% vs +553%, p=0.011). At day +2, median tumor elasticity increased in the sorafenib-treated group (+6.69%, range -30.17% to +58.51%), while it decreased in the vehicle group (-3.19%, range -53.32% to +37.94%), leading to a significant difference in absolute values (p=0.034). From this time point onward, elasticity decreased in both groups, with similar speed over time, no longer being statistically different. In sorafenib-treated mice, all 6 best responders at day +14 showed an increase in elasticity at day +2 (ranging from +3.30% to +58.51%) in comparison to baseline, whereas 3 of the 4 poorer responders showed a decrease. Interestingly, these 3 tumours showed elasticity values higher than responder tumours at day 0. Conclusions: Elastosonography appears a promising non-invasive new technique for the early prediction of HCC tumor response to sorafenib. Indeed, we showed that responder tumours are characterized by an early increase in elasticity. The possibility of distinguishing a priori between responders and non-responders based on the higher elasticity of the latter needs to be validated in ad-hoc experiments, and confirmation of our results in humans is warranted.
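
The normalization used to compare elasticity over time can be expressed in a couple of lines of code; this is only an illustration of the ratio-to-reference-pad idea, with made-up numbers, not the study's analysis script.

```python
# Minimal sketch (illustrative only): relative tumour stiffness normalized to a
# reference stand-off pad, and its percent change from baseline (day 0) to day +2.
# All numerical values below are made-up placeholders.
def relative_stiffness(tumor_stiffness: float, pad_stiffness: float) -> float:
    """Tumour stiffness in arbitrary units relative to the reference pad."""
    return tumor_stiffness / pad_stiffness

def percent_change(baseline: float, followup: float) -> float:
    return 100.0 * (followup - baseline) / baseline

day0 = relative_stiffness(tumor_stiffness=22.0, pad_stiffness=40.0)  # placeholder
day2 = relative_stiffness(tumor_stiffness=23.5, pad_stiffness=40.0)  # placeholder
print(f"Elasticity change at day +2: {percent_change(day0, day2):+.2f}%")
```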

Relevance:

30.00%

Publisher:

Abstract:

The "sustainability" concept relates to the prolonging of human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship, and practices that conserve resources in a manner that allows growth and development to be sustained for the long term without degrading the environment, are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe, as well as in the U.S., are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures compared to hot mix asphalt (HMA), while aiming to maintain the desired post-construction properties of traditional HMA. Lowering the production temperature reduces fuel usage and the production of emissions, and thereby improves conditions for workers and supports sustainable development. The crumb-rubber modifier (CRM), obtained from shredded automobile tires and used in the United States since the mid-1980s, has also proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is relevant not only from an environmental point of view but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project aims to demonstrate the dual value of these asphalt mixes with regard to environmental and mechanical performance, and to suggest a low environmental impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design, but it cannot be the only step. The eco-compatible approach should be extended also to the design method and material characterization, because only with these phases is it possible to exploit the maximum potential properties of the materials used. Appropriate asphalt concrete characterization is essential and vital for realistic performance prediction of asphalt concrete pavements. Volumetric (mix design) and mechanical (permanent deformation and fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to use the material correctly. A design method such as a Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under different traffic and environmental conditions, was the application of choice. In particular, this study focuses on CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to surface cracking and rutting respectively. It works in increments of time and, using the output from one increment recursively as input to the next, predicts the pavement conditions in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation under defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as a surface layer of 60 mm thickness.
The performance of the pavement was compared to the performance of the same pavement structure where different kinds of asphalt concrete were used as the surface layer. In comparison to a conventional asphalt concrete, three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed. The first two chapters summarize the steps needed to satisfy the sustainable pavement design procedure. In Chapter I the problem of eco-compatible asphalt pavement design is introduced; the low environmental impact materials, Warm Mix Asphalt and Rubberized Asphalt Concrete, are described in detail, and the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate material selection and performance evaluation. In Chapter III, CalME is introduced through a specific explanation of the different design approaches it offers, and in particular of the I-R procedure. In Chapter IV, the experimental program is presented with an explanation of the laboratory test devices adopted. The fatigue and rutting performances of the study mixes are shown in Chapters V and VI respectively. Through these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the simulations of asphalt pavement structures with different surface layers are reported. For each pavement structure, the total surface cracking, the total rutting, the fatigue damage and the rut depth in each bound layer were analyzed.
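
The incremental-recursive idea can be sketched as a loop in which the damaged state returned by one time increment becomes the input of the next; the toy damage laws and traffic/climate inputs below are placeholders and bear no relation to the calibrated CalME models.

```python
# Minimal sketch, not CalME itself: the shape of an Incremental-Recursive (I-R)
# simulation, where the state at the end of one increment is fed back as the
# input of the next. Damage laws and parameters are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class PavementState:
    layer_modulus_mpa: float   # stiffness of the asphalt surface layer
    fatigue_damage: float      # 0 = intact, 1 = fully cracked
    rut_depth_mm: float

def step(state: PavementState, axle_loads: int, temp_c: float) -> PavementState:
    """Advance one increment using toy damage laws."""
    d_fatigue = 1e-7 * axle_loads * (1 + 0.02 * max(temp_c - 20, 0))
    damage = min(state.fatigue_damage + d_fatigue, 1.0)
    modulus = state.layer_modulus_mpa * (1 - 0.5 * d_fatigue)   # stiffness loss
    rut = state.rut_depth_mm + 5e-6 * axle_loads * (1 + 0.05 * max(temp_c - 20, 0))
    return PavementState(modulus, damage, rut)

state = PavementState(layer_modulus_mpa=6000.0, fatigue_damage=0.0, rut_depth_mm=0.0)
for month in range(240):               # 20-year design life in monthly increments
    warm_season = 10 if month % 12 > 5 else 0
    state = step(state, axle_loads=50_000, temp_c=15 + warm_season)
print(state)
```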

Relevance:

30.00%

Publisher:

Abstract:

The research project we present arises from the virtuous combination of theory and teaching practice in the spirit of action research. The aim of this work is to develop a training curriculum for specialized translation in the medical-scientific, technical and economic-legal domains for the Spanish-Italian language combination, within the concrete institutional framework of the Italian university today. Our training proposal rests on three elements: a survey of the current translation market for the language combination in question, the identification of the learning objectives on the basis of the chosen model of translation competence, and the design of a competence-based curriculum built on the task-based approach (enfoque por tareas) to translation. In designing the teaching activities, two aspects define the proposed curriculum: the concept of the specialized text genre for translation, and information management by means of new technologies (corpora, terminological and phraseological databases, translation memories, controlled translation). The work is organized in two parts: the first part (four chapters) presents the theoretical framework within which the reflection on the didactics of specialized translation is developed; the second part (two chapters) presents the methodological and analytical framework within which our didactic proposal is elaborated. The first chapter illustrates the relationship between translation and the professional world; the second chapter presents the concept of translation competence as a bridge between training and the world of professional translation; the third chapter retraces the main stages in the evolution of the didactics of general translation; the fourth chapter illustrates some of the most recent and comprehensive didactic proposals for specialized translation in the technical, medical-scientific and economic-legal domains. The fifth chapter introduces the concept of the specialized text genre for translation, and the sixth chapter illustrates the didactic proposal for specialized translation from Spanish into Italian that motivated this work.

Relevance:

30.00%

Publisher:

Abstract:

The concept of competitiveness, long considered as strictly connected to economic and financial performance, has evolved, above all in recent years, toward new, wider interpretations disclosing its multidimensional nature. The shift to a multidimensional view of the phenomenon has excited an intense debate involving theoretical reflections on its characterizing features, as well as methodological considerations on its assessment and measurement. The present research has a twofold objective: to deepen the study of the tangible and intangible aspects characterizing multidimensional competitive phenomena from a micro-level point of view, and to measure competitiveness through a model-based approach. Specifically, we propose a non-parametric approach to Structural Equation Model techniques for the computation of multidimensional composite measures. Structural Equation Model tools are used to develop the empirical application on the Italian case: a model-based micro-level competitiveness indicator is constructed for the measurement of the phenomenon on a large sample of Italian small and medium enterprises.
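
To give a flavour of what a model-based composite measure looks like, the sketch below builds a simple micro-level score by standardizing firm indicators and weighting them with first-principal-component loadings. This PCA-weighted composite is a deliberately simplified stand-in, not the non-parametric Structural Equation Model estimator proposed in the research, and the indicators and data are invented.

```python
# Minimal sketch with invented firm-level data: a simple composite
# competitiveness score from standardized indicators and PCA-based weights.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
firms = pd.DataFrame({                       # placeholder indicators
    "export_intensity": rng.random(500),
    "rd_spending": rng.random(500),
    "labour_productivity": rng.random(500),
    "digital_adoption": rng.random(500),
})

X = StandardScaler().fit_transform(firms)    # put indicators on a common scale
pca = PCA(n_components=1).fit(X)
weights = np.abs(pca.components_[0])
weights /= weights.sum()                     # normalize weights to sum to 1

firms["competitiveness_score"] = X @ weights
print(firms["competitiveness_score"].describe())
```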

Relevance:

30.00%

Publisher:

Abstract:

Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complex that it is difficult to understand whether they meet a given requirement or are able to show some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to performing the conformance check, in order to identify any deviation from the desired behaviour as soon as possible and, possibly, apply some corrections. The declarative framework that implements our approach – entirely developed on the promising open source forward-chaining Production Rule System (PRS) named Drools – consists of three components:
1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC);
2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning;
3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system.
The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology allows any deviation from the desired behaviour to be reconciled as soon as it is detected. In conclusion, the proposed methodology brings some advancements towards solving the problem of conformance checking, helping to fill the gap between humans and increasingly complex technology.
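
To convey the idea behind Event-Condition-Expectation rules, the sketch below checks a small event trace in plain Python: each triggering event raises an expectation that must be fulfilled by a matching event within a deadline, otherwise a violation is reported. The actual framework expresses this declaratively in Drools on top of Event Calculus; the event names and deadline here are invented.

```python
# Minimal conceptual sketch in Python (the thesis framework is built on Drools,
# a Java rule engine; this is only an illustration of the expectation idea).
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    time: float            # seconds since the start of the trace

def check_expectation(trace, trigger, expected, within):
    """For every `trigger` event, expect an `expected` event within `within` s."""
    violations = []
    for ev in trace:
        if ev.name != trigger:
            continue
        fulfilled = any(
            other.name == expected and ev.time < other.time <= ev.time + within
            for other in trace
        )
        if not fulfilled:
            violations.append(ev)
    return violations

trace = [Event("order_received", 0.0), Event("order_shipped", 30.0),
         Event("order_received", 100.0)]                 # second order never ships
late = check_expectation(trace, "order_received", "order_shipped", within=60.0)
print(f"{len(late)} violated expectation(s)")            # -> 1
```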

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a universal model of documents and deltas. The model formalizes what it means to find differences between documents and provides a single shared formalization that can be used by any algorithm to describe the differences found between any kind of comparable documents. The main scientific contribution of this thesis is a universal delta model that can be used to represent the changes found by an algorithm. The main parts of this model are the formal definitions of changes (the pieces of information recording that something has changed), operations (the definitions of the kind of change that happened) and deltas (coherent summaries of what has changed between two documents). The fundamental mechanism that makes the universal delta model a very expressive tool is the use of encapsulation relations between changes. In the universal delta model, changes are not always simple records of what has changed; they can also be combined into more complex changes that reflect the detection of more meaningful modifications. In addition to the main entities (i.e., changes, operations and deltas), the model also describes and defines documents and the concept of equivalence between documents. As a corollary to the model, there is also an extensible catalog of possible operations that algorithms can detect, used to create a common library of operations, and a UML serialization of the model, useful as a reference when implementing APIs that deal with deltas. The universal delta model presented in this thesis acts as the formal groundwork upon which algorithms can be based and libraries can be implemented. It removes the need to recreate a new delta model and terminology whenever a new algorithm is devised. It also alleviates the problems that toolmakers face when adapting their software to new diff algorithms.
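
A rough rendering of the model's main entities as data structures may help fix ideas: changes carry an operation and a target, deltas collect changes, and encapsulation lets low-level changes be grouped into a more meaningful one. The field names and the XPath-like targets are illustrative choices, not the thesis's formal definitions or its UML serialization.

```python
# Minimal sketch, not the thesis's formal model: dataclasses conveying
# operations, changes, deltas, and the encapsulation of simpler changes.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Operation:
    kind: str                      # e.g. "insert", "delete", "move", "replace"

@dataclass
class Change:
    operation: Operation
    target: str                    # location in the document (e.g. an XPath)
    encapsulates: List["Change"] = field(default_factory=list)  # nested changes

@dataclass
class Delta:
    source_doc: str
    target_doc: str
    changes: List[Change] = field(default_factory=list)

# Two low-level edits detected by a diff algorithm...
ins = Change(Operation("insert"), target="/p[3]/text()[1]")
dele = Change(Operation("delete"), target="/p[3]/text()[2]")
# ...encapsulated into a single, more meaningful "replace" change.
replace = Change(Operation("replace"), target="/p[3]", encapsulates=[ins, dele])

delta = Delta("v1.xml", "v2.xml", changes=[replace])
print(len(delta.changes), "top-level change(s)")
```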

Relevance:

30.00%

Publisher:

Abstract:

The recent European Union Directive 31/2010 requires member states to reorganize their national legislative framework on the energy performance of buildings, so that from 1 January 2021 all new buildings have an energy balance tending to zero; the deadline is brought forward to 1 January 2019 for public buildings. The concept of nearly zero energy buildings (nZEB) starts from the assumption of an envelope built to passive standards, and then offsets the modest annual energy demand, preferably through on-site production of energy from renewable sources. From this perspective, reconsidering the potential of solar architecture identifies concrete tools and sound methodologies to support the design of increasingly high-performance envelopes that fully exploit an inexhaustible, widespread resource within everyone's reach such as solar energy. All this also in view of the no longer deferrable need to reduce the energy load attributable to buildings, which are responsible, as is well known, for over 40% of world energy consumption and 24% of greenhouse gas emissions. On these premises, the research places at its centre the integration into the building envelope of solar thermal-gain systems (so-called passive) and solar energy production systems (so-called active). The analytical and practical path followed aims to provide methodological and practical tools for architectural design, which needs a new integrated approach aimed at achieving energy-saving objectives. Through a general survey of the concept of solar architecture and of the theoretical and terminological premises underlying it, the research arrived at three types of final outcome: a codification of the recurrent morphologies in solar buildings, a comparative analysis of solar yield in the main building typologies, and a substantial design-verification part in which the assumptions of the previous categories were applied.

Relevance:

30.00%

Publisher:

Abstract:

A new control scheme is presented in this thesis. Based on the NonLinear Geometric Approach (NLGA), the proposed Active Control System represents a new way to see reconfigurable controllers for aerospace applications. The presence of the Diagnosis module (providing the estimation of generic signals which, depending on the case, can be faults, disturbances or system parameters), the main feature of the proposed Active Control System, is a characteristic shared by three well-known control schemes: Active Fault Tolerant Control, Indirect Adaptive Control and Active Disturbance Rejection Control. The standard NLGA has been accurately investigated and then improved to extend its applicability to more complex models. The standard NLGA procedure has been modified to take account of feasible and estimable sets of unknown signals. Furthermore, the application of the Singular Perturbations approximation has led to the solution of Detection and Isolation problems in scenarios too complex to be solved by the standard NLGA. The estimation process has also been improved, where multiple redundant measurements are available, by the introduction of a new algorithm, here called "Least Squares - Sliding Mode". It guarantees optimality, in the sense of least squares, and finite estimation time, in the sense of sliding mode. The Active Control System concept has been formalized in two controllers: a nonlinear backstepping controller and a nonlinear composite controller. Particularly interesting is the integration, in the controller design, of the estimations coming from the Diagnosis module. Stability proofs are provided for both control schemes. Finally, different aerospace applications are provided to show the applicability and effectiveness of the proposed NLGA-based Active Control System.
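
As a generic building block (explicitly not the "Least Squares - Sliding Mode" algorithm introduced in the thesis), the snippet below shows how multiple redundant, noisy measurements of the same unknown signal can be fused by an ordinary batch least-squares estimate; the measurement matrices and noise level are arbitrary assumptions.

```python
# Minimal sketch: batch least-squares fusion of redundant, noisy measurements
# of the same unknown signal vector. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
x_true = np.array([2.0, -1.0])           # unknown signal (e.g. a fault amplitude pair)

# Three redundant sensors, each observing x through its own matrix H_i
H = np.vstack([np.eye(2), 2 * np.eye(2), np.array([[1.0, 1.0], [0.0, 1.0]])])
y = H @ x_true + 0.05 * rng.standard_normal(H.shape[0])   # stacked noisy readings

# Least-squares estimate: x_hat = argmin ||y - H x||^2
x_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
print("estimate:", x_hat)
```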

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, in developed countries, excessive food intake, in conjunction with decreased physical activity, has led to an increase in lifestyle-related diseases such as obesity, cardiovascular diseases, type-2 diabetes, a range of cancer types and arthritis. The socio-economic importance of such lifestyle-related diseases has encouraged countries to increase their efforts in research, and many projects have recently been initiated in research that focuses on the relationship between food and health. Thanks to these efforts and to the growing availability of technologies, food companies are beginning to develop healthier food. The need for rapid and affordable methods to help the food industry in ingredient selection has stimulated the development of in vitro systems that simulate the physiological functions to which food components are subjected when administered in vivo. One of the most promising tools now available appears to be in vitro digestion, which aims at predicting, in a comparative way among analogue food products, the bioaccessibility of the nutrients of interest. A foodomics approach was chosen in this work to evaluate the modifications occurring during the in vitro digestion of selected protein-rich food products. The measurement of protein breakdown was performed via NMR spectroscopy, the only technique capable of observing, directly in the simulated gastric and duodenal fluids, the soluble oligo- and polypeptides released during the in vitro digestion process. The overall approach pioneered in this PhD work has been discussed and promoted in a large scientific community, with specialists networked under the INFOGEST COST Action, which recently released a harmonized protocol for in vitro digestion. NMR spectroscopy, when used in tandem with in vitro digestion, generates a new concept that provides an additional attribute to describe food quality: the comparative digestibility, which measures the improvement in nutrient bioaccessibility.