16 results for Semantic Web and its applications
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The main objective of this research is to reconstruct the state of the art in eHealth and the Electronic Health Record (Fascicolo Sanitario Elettronico), with particular attention to the topics of personal data protection and interoperability. To this end, binding and non-binding European Union documents were examined, together with selected European and national projects (such as “Smart Open Services for European Patients” (EU); “Elektronische Gesundheitsakte” (Austria); “MedCom” (Denmark); and, for Italy, “Infrastruttura tecnologica del Fascicolo Sanitario Elettronico”, “OpenInFSE: Realizzazione di un’infrastruttura operativa a supporto dell’interoperabilità delle soluzioni territoriali di fascicolo sanitario elettronico nel contesto del sistema pubblico di connettività”, “Evoluzione e interoperabilità tecnologica del Fascicolo Sanitario Elettronico”, and “IPSE - Sperimentazione di un sistema per l’interoperabilità europea e nazionale delle soluzioni di Fascicolo Sanitario Elettronico: componenti Patient Summary e ePrescription”). The legal and technical analyses show an urgent need to define models that encourage the use of health data and to implement effective strategies for the secondary use of digital health data, such as Open Data and Linked Open Data. Legal and technological harmonisation is seen as a strategic means of reducing both the conflicts over personal data protection that exist among Member States and the lack of interoperability among European Electronic Health Record information systems. To this end, three guidelines were identified: (1) harmonisation of legislation, (2) harmonisation of rules, (3) harmonisation of information system design. The principles of Privacy by Design (“proactive” and “win-win”), together with Semantic Web standards, are regarded as key to achieving this change.
Abstract:
This work is concerned with the increasing relationship between two distinct multidisciplinary research fields, Semantic Web technologies and scholarly publishing, which in this context converge into one precise research topic: Semantic Publishing. In the spirit of the original aim of Semantic Publishing, i.e. the improvement of scientific communication by means of semantic technologies, this thesis proposes theories, formalisms and applications for opening up semantic publishing to an effective interaction between scholarly documents (e.g., journal articles) and their related semantic and formal descriptions. The main aim of this work is to increase users' comprehension of documents and to allow document enrichment, discovery and linkage to document-related resources and contexts, such as other articles and raw scientific data. To achieve these goals, this thesis investigates and proposes solutions for three of the main issues that semantic publishing promises to address, namely: the need for tools linking document text to a formal representation of its meaning, the lack of complete metadata schemas for describing documents according to the publishing vocabulary, and the absence of effective user interfaces for easily acting on semantic publishing models and theories.
Abstract:
My doctoral research concerns the modelling of symbolism in the cultural heritage domain and the connection of artworks based on their symbolism through knowledge extraction and representation techniques. In particular, I participated in the design of two ontologies: one models the relationships between a symbol, its symbolic meaning, and the cultural context in which the symbol symbolizes that meaning; the second models artistic interpretations of a cultural heritage object from an iconographic and iconological (thus also symbolic) perspective. I also converted several sources of unstructured data (a dictionary of symbols and an encyclopaedia of symbolism) and semi-structured data (DBpedia and WordNet) to create HyperReal, the first knowledge graph dedicated to conventional cultural symbolism. Making use of HyperReal's content, I showed how linked open data about cultural symbolism can be used to initiate a series of quantitative studies that analyse (i) similarities between cultural contexts based on their symbologies, (ii) broad symbolic associations, and (iii) specific case studies of symbolism, such as the relationship between symbols, their colours, and their symbolic meanings. Moreover, I developed a system that can infer symbolic, cultural context-dependent interpretations from artworks according to what they depict, envisioning potential use cases for museum curation. I then re-engineered the iconographic and iconological statements of Wikidata, a widely used general-domain knowledge base, creating ICONdata: an iconographic and iconological knowledge graph. ICONdata was then enriched with automatic symbolic interpretations. Subsequently, I demonstrated the significance of enhancing artwork information through alignment with linked open data related to symbolism, resulting in the discovery of novel connections between artworks. Finally, I contributed to the creation of a software application.
This application leverages established connections, allowing users to investigate the symbolic expression of a concept across different cultural contexts through the generation of a three-dimensional exhibition of artefacts symbolising the chosen concept.
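The inference step described above — reading an artwork's symbolism through the conventions of a chosen cultural context — can be sketched in a few lines. The (symbol, culture) table below is invented purely for illustration; HyperReal's actual contents and schema are far richer:

```python
# Hypothetical lookup table: (symbol, cultural context) -> conventional meanings.
# Invented for illustration; not HyperReal data.
SYMBOL_TABLE = {
    ("serpent", "Christian"): ["temptation", "evil"],
    ("serpent", "Ancient Greek"): ["healing", "rebirth"],
    ("lily", "Christian"): ["purity"],
}

def interpret(depicted, culture):
    """Return candidate symbolic meanings of the depicted entities
    under the conventions of the given cultural context."""
    readings = {}
    for entity in depicted:
        meanings = SYMBOL_TABLE.get((entity, culture))
        if meanings:
            readings[entity] = meanings
    return readings

# The same depicted content reads differently across cultural contexts:
christian_reading = interpret(["serpent", "lily"], "Christian")
greek_reading = interpret(["serpent"], "Ancient Greek")
```

The point of the sketch is the context-dependence: the interpretation is a function of both what is depicted and the culture queried, which is what makes cross-cultural comparison of symbologies possible.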
Abstract:
This work provides a forward step in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we guided the reader through the fundamental notions of probability and stochastic processes, as well as stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focused on fractional Brownian motion (fBm) and its discrete-time increment process, fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modelling and estimation of long memory, or long-range dependence (LRD). Time series manifesting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We studied LRD in depth, giving many real-data examples, providing statistical analysis and introducing parametric estimation methods. We then introduced the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modelling systems with long-memory properties. After introducing the basic concepts, we provided many examples and applications. For instance, we investigated the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems deviating from the classical exponential Debye pattern. We then focused on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations have been obtained by using fractional integrals and derivatives of distributed order.
To find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduced and studied the generalized grey Brownian motion (ggBm), a parametric class of H-sssi processes whose marginal probability density function evolves in time according to a partial integro-differential equation of fractional type. The ggBm is, of course, non-Markovian. Throughout the work, we remarked many times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focused on a subclass made up of processes with stationary increments. The ggBm has been defined canonically in the so-called grey noise space. However, we were able to provide a characterization regardless of the underlying probability space. We also pointed out that the generalized grey Brownian motion is a direct generalization of a Gaussian process; in particular, it generalizes both Brownian motion and fractional Brownian motion. Finally, we introduced and analyzed a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We started from the forward drift equation, which was made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation was interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then considered the subordinated process Y(t)=X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation involving the same memory kernel K(t).
We developed several applications and derived the exact solutions. Moreover, we considered different stochastic models for the given equations, providing path simulations.
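As a minimal illustration of fGn and LRD estimation (a generic textbook construction, not the thesis's own methods), one can draw an exact fGn sample from its Toeplitz autocovariance and recover the Hurst exponent H with the aggregated-variance method, which uses the scaling Var[block means] ~ m^(2H-2):

```python
import numpy as np

rng = np.random.default_rng(0)

def fgn(n, H, rng):
    """Exact fGn sample via the Cholesky factor of the Toeplitz autocovariance
    gamma(k) = 0.5 * (|k+1|^(2H) - 2|k|^(2H) + |k-1|^(2H))."""
    k = np.arange(n)
    gamma = 0.5 * ((k + 1.0) ** (2 * H) - 2.0 * k ** (2 * H) + np.abs(k - 1.0) ** (2 * H))
    cov = gamma[np.abs(np.subtract.outer(k, k))]   # Toeplitz covariance matrix
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def hurst_aggvar(x, blocks=(1, 2, 4, 8, 16, 32)):
    """Aggregated-variance estimator: Var of m-block means scales like m**(2H - 2),
    so H is recovered from the slope of log Var against log m."""
    logm, logv = [], []
    for m in blocks:
        means = x[: (len(x) // m) * m].reshape(-1, m).mean(axis=1)
        logm.append(np.log(m))
        logv.append(np.log(means.var()))
    slope = np.polyfit(logm, logv, 1)[0]
    return 1.0 + slope / 2.0

x = fgn(1024, 0.8, rng)     # one sample path of the increment process
H_hat = hurst_aggvar(x)     # estimate should land near the true H = 0.8
```

The Cholesky route is exact but O(n^3); for long series, circulant-embedding (Davies–Harte) methods are the usual choice.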
Abstract:
The electromagnetic spectrum can be identified as a resource for the designer, as well as for the manufacturer, from two complementary points of view: first, because it is a good in great demand by many different kinds of applications; second, because despite its scarce availability, it may be advantageous to use more spectrum than strictly necessary. This is the case of spread-spectrum systems, in which the transmitted signal is spread over a wide frequency band, much wider, in fact, than the minimum bandwidth required to transmit the information being sent. Part I of this dissertation deals with Spread-Spectrum Clock Generators (SSCG) aimed at reducing the Electromagnetic Interference (EMI) of clock signals in integrated circuit (IC) design. In particular, the modulation of the clock and the consequent spreading of its spectrum are obtained through a random modulating signal output by a chaotic map, i.e. a discrete-time dynamical system showing chaotic behavior. The advantages offered by this kind of modulation are highlighted. Three different prototypes of chaos-based SSCG are presented in all their aspects: design, simulation, and post-fabrication measurements. The third one, operating at a frequency of 3 GHz, targets Serial ATA, the de facto standard for fast data transmission to and from hard disk drives. The most extreme example of spread-spectrum signalling is the emerging ultra-wideband (UWB) technology, which proposes the use of large sections of the radio spectrum at low amplitudes to transmit high-bandwidth digital data. In Part II of the dissertation, two UWB applications are presented, both dealing with the advantages as well as the challenges of a wide-band system, namely: a chaos-based sequence generation method for reducing Multiple Access Interference (MAI) in Direct-Sequence UWB Wireless Sensor Networks (WSNs), and the design and simulation of a Low-Noise Amplifier (LNA) for impulse-radio UWB.
This latter topic was studied during a study-abroad period in collaboration with Delft University of Technology, Delft, the Netherlands.
Abstract:
Two of the main features of today's complex software systems, like pervasive computing systems and Internet-based applications, are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control: systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution: entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution: interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic since they allow components to be updated, added, or removed while the system is running. The engineering of open and distributed computational systems mandates the adoption of a software infrastructure whose underlying model and technology can provide the required level of uncoupling among system components. This is the main motivation behind the current research trend in coordination middleware of exploiting tuple-based coordination models in the engineering of complex software systems, since they intrinsically provide coordinated components with communication uncoupling (see the references for further details). An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely, scenarios where most of the activities are based on knowledge in some form, and where knowledge becomes the prominent means by which systems get coordinated.
Handling knowledge in tuple-based systems induces problems in terms of syntax (e.g., two tuples containing the same data may not match due to differences in the tuple structure) and, mostly, of semantics (e.g., two tuples representing the same information may not match because of the different syntax adopted). Until now, the problem has been faced by exploiting tuple-based coordination within middleware for knowledge-intensive environments: e.g., experiments with tuple-based coordination within a Semantic Web middleware (analogous approaches are surveyed in the literature). However, these appear to be designed to tackle the design of coordination for specific application contexts, like the Semantic Web and Semantic Web Services, and they result in rather involved extensions of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, the model and technology of semantic tuple centres were developed. The tuple centre model is adopted as the main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space where the behaviour of the tuple space can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. The tuple centre model was then semantically enriched: a main design choice in this work was not to completely redesign the existing syntactic tuple space model, but rather to provide a smooth extension that, although supporting semantic reasoning, keeps the simplicity of tuples and tuple matching as far as possible. By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components.
The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model on top of an existing coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem suitable as coordination media.
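The two ingredients of a tuple centre — Linda-style associative matching plus programmable reactions to interaction events — can be sketched minimally as follows. This toy uses plain Python tuples with `None` as a wildcard and deliberately ignores the semantic-matching layer that is the thesis's actual contribution:

```python
from threading import Lock

class TupleCentre:
    """Toy programmable tuple space: Linda-style out/inp primitives plus
    reactions that fire when an inserted tuple matches a template."""

    def __init__(self):
        self._tuples = []
        self._reactions = []   # (template, callback) pairs fired on out()
        self._lock = Lock()

    @staticmethod
    def _match(template, tup):
        """Purely syntactic matching; None in a template matches anything."""
        return len(template) == len(tup) and all(
            t is None or t == v for t, v in zip(template, tup))

    def reaction(self, template, callback):
        """Install a coordination law inside the medium itself."""
        self._reactions.append((template, callback))

    def out(self, tup):
        with self._lock:
            self._tuples.append(tup)
        for template, cb in self._reactions:
            if self._match(template, tup):
                cb(self, tup)

    def inp(self, template):
        """Withdraw the first matching tuple, or None if absent."""
        with self._lock:
            for i, tup in enumerate(self._tuples):
                if self._match(template, tup):
                    return self._tuples.pop(i)
        return None

tc = TupleCentre()
# Coordination law lives in the medium, not in the interacting components:
tc.reaction(("celsius", None),
            lambda c, t: c.out(("fahrenheit", t[1] * 9 / 5 + 32)))
tc.out(("celsius", 25))
```

The semantic enrichment described in the thesis would, in essence, replace `_match` with reasoning over a semantic representation of the domain of discourse, so that tuples match on meaning rather than on structure alone.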
Abstract:
Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmer's responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages have been proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm on modern many-core systems, focusing on ease of programming. It focuses on OpenMP, the de facto standard for shared-memory programming. In the first part, the costs of algorithms for synchronization and data partitioning are analyzed, and the algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. In the second part of the thesis, the focus is on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders of magnitude of speedup and energy efficiency compared to the “pure software” version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, and a template for a scalable architecture is shown, which integrates the HWPUs through the shared-memory system.
Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform. The OpenMP frontend is extended to interact with them.
Abstract:
The purpose of this thesis is the atomic-scale simulation of the crystal-chemical and physical (phonon, energetic) properties of some strategically important minerals for structural ceramics, biomedical and petrological applications. These properties affect thermodynamic stability and rule the mineral-environment interface phenomena, with important economic, (bio)technological, petrological and environmental implications. The minerals of interest belong to the families of phyllosilicates (talc, pyrophyllite and muscovite) and apatite (OHAp), chosen for their importance in industrial and biomedical applications (structural ceramics) and petrophysics. In this thesis work we applied quantum mechanical methods, formulas and knowledge to the resolution of mineralogical problems (“Quantum Mineralogy”). The chosen theoretical approach is Density Functional Theory (DFT), along with periodic boundary conditions, to limit the portion of the mineral in analysis to the crystallographic cell, and the hybrid functional B3LYP. The crystalline orbitals were simulated by linear combinations of Gaussian-type functions (GTO). The dispersive forces, which are important for the structural determination of phyllosilicates and not properly considered in pure DFT methods, were included by means of a semi-empirical correction. The phonon and mechanical properties were also calculated. The equation of state, both in athermal conditions and over a wide temperature range, was obtained by means of variations in the volume of the cell and the quasi-harmonic approximation. Some thermochemical properties of the minerals (isochoric and isobaric heat capacity) were calculated because of their considerable practical importance. For the first time, three-dimensional charts of these properties at different pressures and temperatures were provided. Hydroxylapatite was studied from the standpoint of structural and phonon properties for its biotechnological role.
In fact, biological apatite represents the inorganic phase of vertebrate hard tissues. Numerous carbonated (hydroxyl)apatite structures were modelled by QM to cover the broadest spectrum of possible biological structural variations relevant to bioceramics applications.
Abstract:
Landslide hazard and risk are growing as a consequence of climate change and demographic pressure. Land-use planning represents a powerful tool to manage this socio-economic problem and build sustainable and landslide-resilient communities. Landslide inventory maps are a cornerstone of land-use planning and, consequently, their quality assessment represents a pressing issue. This work aimed to define the quality parameters of a landslide inventory and to assess its spatial and temporal accuracy with regard to its possible applications to land-use planning. To this end, I proceeded according to a two-step approach. An overall assessment of the accuracy of data geographic positioning was performed on four case-study sites located in the Italian Northern Apennines. The quantification of the overall spatial and temporal accuracy, instead, focused on the Dorgola Valley (Province of Reggio Emilia). The assessment of spatial accuracy involved a comparison between remotely sensed and field-survey data, as well as an innovative fuzzy-like analysis of a multi-temporal landslide inventory map. Conversely, long- and short-term landslide temporal persistence was appraised over a period of 60 years with the aid of 18 remotely sensed image sets. These results were eventually compared with the current Territorial Plan for Provincial Coordination (PTCP) of the Province of Reggio Emilia. The outcome of this work suggests that geomorphologically detected and mapped landslides are a significant approximation of a more complex reality. In order to convey this intrinsic uncertainty to end-users, a new form of cartographic representation is needed; in this sense, a fuzzy raster landslide map may be an option. With regard to land-use planning, landslide inventory maps, if appropriately updated, were confirmed to be essential decision-support tools.
This research, however, proved that their spatial and temporal uncertainty discourages any direct use as zoning maps, especially when zoning itself is associated with statutory or advisory regulations.
Abstract:
In the first chapter, I develop a panel no-cointegration test which extends the bounds test of Pesaran, Shin and Smith (2001) to the panel framework by considering the individual regressions in a Seemingly Unrelated Regression (SUR) system. This makes it possible to take into account unobserved common factors that contemporaneously affect all the units of the panel and provides, at the same time, unit-specific test statistics. Moreover, the approach is particularly suited when the number of individuals in the panel is small relative to the number of time series observations. I develop the algorithm to implement the test and use Monte Carlo simulation to analyze its properties. The small-sample properties of the test are remarkable compared to its single-equation counterpart. I illustrate the use of the test through a test of Purchasing Power Parity in a panel of EU15 countries. In the second chapter of my PhD thesis, I verify the Expectation Hypothesis of the Term Structure (EHTS) in the repurchase agreement (repo) market with a new testing approach. I consider an "inexact" formulation of the EHTS, which models a time-varying component in the risk premia, and I treat the interest rates as a non-stationary cointegrated system. The effect of heteroskedasticity is controlled by means of testing procedures (bootstrap and heteroskedasticity correction) which are robust to variance and covariance shifts over time. I find that the long-run implications of the EHTS are verified. A rolling-window analysis clarifies that the EHTS is only rejected in periods of turbulence in financial markets. The third chapter introduces the Stata command "bootrank", which implements the bootstrap likelihood ratio rank test algorithm developed by Cavaliere et al. (2012). The command is illustrated through an empirical application on the term structure of interest rates in the US.
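The flavour of heteroskedasticity-robust bootstrap inference used in the second chapter can be illustrated with a deliberately simple stand-in: a wild-bootstrap test of a zero mean, where residuals are resampled with random Rademacher signs so each observation keeps its own variance. This is not the actual panel bounds test or the likelihood ratio rank test, only the general bootstrap idea:

```python
import numpy as np

rng = np.random.default_rng(1)

def wild_bootstrap_pvalue(y, n_boot=999, rng=rng):
    """Wild-bootstrap p-value for H0: E[y] = 0, robust to heteroskedasticity.
    Multiplying null-imposed residuals by +/-1 preserves each observation's
    variance, unlike ordinary iid resampling."""
    t_obs = abs(y.mean()) / (y.std(ddof=1) / np.sqrt(len(y)))
    resid = y - y.mean()                       # impose the null hypothesis
    count = 0
    for _ in range(n_boot):
        yb = resid * rng.choice([-1.0, 1.0], size=len(y))
        t_b = abs(yb.mean()) / (yb.std(ddof=1) / np.sqrt(len(yb)))
        count += t_b >= t_obs
    return (1 + count) / (1 + n_boot)

# Heteroskedastic data generated under the null (mean zero)...
y0 = rng.standard_normal(200) * np.linspace(0.5, 3.0, 200)
p_null = wild_bootstrap_pvalue(y0)
# ...and under a clear alternative (mean shifted to one):
p_alt = wild_bootstrap_pvalue(y0 + 1.0)
```

The bootstrap recomputes the test statistic on data generated under the null, so the resulting p-value adapts automatically to the (unknown, time-varying) variance pattern, which is the property the thesis's testing procedures rely on.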
Abstract:
The primary aim of the research activity presented in this PhD thesis was the development of an innovative hardware and software solution for creating a unique tool for kinematic and electromyographic analysis of the human body in an ecological setting. For this purpose, innovative algorithms have been proposed regarding different aspects of inertial and magnetic data elaboration: magnetometer calibration and magnetic field mapping (Chapter 2), data calibration (Chapter 3) and the sensor-fusion algorithm. Topics that may conflict with the confidentiality agreement between the University of Bologna and NCS Lab are not covered in this thesis. After developing and testing the wireless platform, research activities focused on its clinical validation. The first clinical study aimed to evaluate the intra- and inter-observer reproducibility of three-dimensional humero-scapulo-thoracic kinematics in an outpatient setting (Chapter 4). A second study aimed to evaluate the effect of Latissimus Dorsi tendon transfer on shoulder kinematics and Latissimus Dorsi activation during internal and external humeral rotations (Chapter 5). Results from both clinical studies demonstrated the ability of the developed platform to enter daily clinical practice, providing useful information for patients' rehabilitation.
Abstract:
This thesis deals with robust adaptive control and its applications, and is divided into three main parts. The first part concerns the design of robust estimation algorithms based on recursive least squares. First, we present an estimator for the frequencies of biased multi-harmonic signals, and then an algorithm for the distributed estimation of an unknown parameter over a network of adaptive agents. In the second part of the thesis, we consider a cooperative control problem over uncertain networks of linear systems and Kuramoto systems, in which the agents have to track the reference generated by a leader exosystem. Since the reference signal is not available to each network node, novel distributed observers are designed to reconstruct the reference signal locally for each agent, thereby decentralizing the problem. In the third and final part of the thesis, we consider robust estimation tasks for mobile robotics applications. In particular, we first consider the problem of slip estimation for agricultural tracked vehicles. Then, we consider a search-and-rescue application in which we need to drive an unmanned aerial vehicle as close as possible to the unknown (and to-be-estimated) position of a victim buried under the snow after an avalanche. In this thesis, robustness is understood as an input-to-state stability property of the proposed identifiers (sometimes referred to as adaptive laws) with respect to additive disturbances, relative to a steady-state trajectory associated with a correct estimation of the unknown parameter to be found.
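As background for the first part, a plain (non-robust) recursive least squares identifier looks as follows; the forgetting factor and the simulated regression values are illustrative, and the thesis's contribution lies precisely in the robust variants this sketch omits:

```python
import numpy as np

class RLS:
    """Textbook recursive least squares for y_k = phi_k . theta + noise,
    with forgetting factor lam (lam = 1 gives ordinary RLS)."""

    def __init__(self, dim, lam=1.0, p0=1e3):
        self.theta = np.zeros(dim)       # parameter estimate
        self.P = p0 * np.eye(dim)        # inverse-information matrix
        self.lam = lam

    def update(self, phi, y):
        phi = np.asarray(phi, float)
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)              # gain vector
        self.theta = self.theta + k * (y - phi @ self.theta)
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# Identify a two-parameter linear model from noisy samples:
rng = np.random.default_rng(0)
true_theta = np.array([2.0, -1.0])
est = RLS(dim=2)
for _ in range(500):
    phi = rng.standard_normal(2)
    y = phi @ true_theta + 0.01 * rng.standard_normal()
    est.update(phi, y)
```

After a few hundred updates, `est.theta` sits very close to `true_theta`. The input-to-state stability property studied in the thesis concerns how such estimates behave when the noise term is an arbitrary bounded disturbance rather than small and well behaved.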
Abstract:
Laser Cladding (LC) is an emerging technology used both for coating applications and for near-net-shape fabrication. Despite its significant advantages, such as low dilution and a metallurgical bond with the substrate, it still faces issues with process control and repeatability, which restrict the extension of its applications. This thesis evaluates LC technology and tests its potential for reducing particulate matter emissions from the automotive and locomotive sectors. The evaluation of LC technology was carried out for the deposition of multi-layer and multi-track coatings. 316L stainless steel coatings were deposited to study the minimisation of geometric distortions in thin-walled samples. Laser power, as well as scan strategy, were the main variables used to achieve this goal. The use of constant power, power reduction at successive layers, a closed-loop control system, and two different scan strategies were studied. The closed-loop control system was found to be practical only when coupled with the correct scan strategy for the deposition of thin walls. Three overlapped layers of aluminium bronze were deposited onto a structural steel pipe for multi-track coatings. The effects of laser power, scan speed and hatch distance on the final geometry of the coating were studied independently, and a combined parameter was established to effectively control each geometrical characteristic (clad width, clad height and percentage of dilution). LC was then applied to coat commercial grey cast iron (GCI) brake discs with tool steel. Optical micrography showed that, even with preheating, cracks originating from the substrate towards the coating were still present. The commercial brake discs emitted airborne particles whose concentration and size depended on the test conditions used for laboratory simulation.
The contact of the LC-cladded wheel with the rail emitted significantly fewer ultra-fine particles while maintaining acceptable values of the coefficient of friction.
Abstract:
The final goal of the bioassay developed during the first two years of my PhD was its application to the screening of the antioxidant activity of nutraceuticals and to monitoring intracellular H2O2 production in peripheral blood mononuclear cells (PBMCs) from hypercholesterolemic subjects before and after two months of treatment with Evolocumab, a new-generation LDL-cholesterol-lowering drug. Moreover, a recombinant bioluminescent protein was developed during the last year using the baculovirus expression system in insect cells. In particular, the protein combines the extracellular domain (ECD) of a high-affinity mutated form of one of the selective Notch ligands, Jagged 1 (Jag1), with a red-emitting firefly luciferase, since a pivotal role of “aberrant” Notch signaling activation in colorectal cancer (CRC) has been reported. The probe was validated and characterized in terms of analytical performance and through imaging experiments, in order to understand whether Jagged1-FLuc binding correlates with Notch signaling overexpression and activation in CRC progression.
Abstract:
Personal archives are the archives created by individuals for their own purposes. Among these are the library and documentary collections of writers and scholars. Only recently has archival literature begun to focus on this category of archives, emphasising how their heterogeneous nature necessitates the conciliation of different approaches to archival description, and calling for a broader understanding of the principle of provenance, recognising that multiple creators, including subsequent researchers, can contribute to shaping personal archives over time by adding new layers of context. Despite these advances in the theoretical debate, current architectures for archival representation lag behind. Finding aids privilege a single point of view and do not allow subsequent users to embed their own, potentially conflicting, readings. Using Semantic Web technologies, this study aims to define a conceptual model for writers' archives based on existing and widely adopted models in the cultural heritage and humanities domains. The model developed can be used to represent different types of documents at various levels of analysis, as well as to record content and components. It also enables the representation of complex relationships and the incorporation of additional layers of interpretation into the finding aid, transforming it from a static search tool into a dynamic research platform. The personal archive and library of Giuseppe Raimondi serve as a case study for the creation of an archival knowledge base using the proposed conceptual model. By querying the knowledge graph through SPARQL, the effectiveness of the model is evaluated. The results demonstrate that the model addresses the primary representation challenges identified in the archival literature, from both a technological and a methodological standpoint. The ultimate goal is to bring the output par excellence of archival science, i.e.
the finding aid, more in line with the latest developments in archival thinking.
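The kind of query such an evaluation relies on can be mimicked with a toy triple store and SPARQL-like pattern matching. The entities and properties below (e.g. `ex:interprets`) are invented stand-ins, not the thesis's actual ontology, and a real knowledge base would use an RDF store queried with SPARQL proper:

```python
# Hypothetical triples about a writer's archive; invented for illustration.
TRIPLES = {
    ("ex:letter42", "ex:partOf", "ex:RaimondiArchive"),
    ("ex:letter42", "ex:mentions", "ex:GiorgioMorandi"),
    ("ex:essay7", "ex:partOf", "ex:RaimondiArchive"),
    ("ex:scholarA", "ex:interprets", "ex:letter42"),
    ("ex:scholarB", "ex:interprets", "ex:letter42"),
}

def match(pattern, triples=TRIPLES):
    """Return one variable-binding dict per triple matching the pattern;
    strings starting with '?' act as variables, as in SPARQL."""
    out = []
    for triple in triples:
        binding, ok = {}, True
        for want, got in zip(pattern, triple):
            if want.startswith("?"):
                binding[want] = got
            elif want != got:
                ok = False
                break
        if ok:
            out.append(binding)
    return out

# "Which scholars added an interpretation of letter 42?" — the kind of
# layered-reading query a static finding aid cannot answer:
interpreters = {b["?who"] for b in match(("?who", "ex:interprets", "ex:letter42"))}
```

The design point the sketch illustrates is that interpretations are first-class statements alongside descriptive ones, so later, potentially conflicting readings accumulate in the same graph and remain queryable.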