Abstract:
This dissertation deals with the design and characterization of novel reconfigurable silicon-on-insulator (SOI) devices to filter and route optical signals on-chip. The design is carried out through circuit simulations based on basic circuit elements (Building Blocks, BBs), in order to prove the feasibility of an approach that moves the design of Photonic Integrated Circuits (PICs) toward the system level. CMOS compatibility and large integration scale make SOI one of the most promising materials for realizing PICs. The concepts of the generic foundry and of BB-based circuit simulation are emerging as a solution to reduce costs and increase circuit complexity. To validate the BB-based approach, some of the most important BBs are developed first. A novel tunable coupler is also presented and shown to be a valuable alternative to known solutions. Two novel multi-element PICs are then analysed: a narrow-linewidth single-mode resonator and a passband filter with widely tunable bandwidth. Extensive circuit simulations are carried out to determine their performance, taking fabrication tolerances into account. The first PIC is based on two Grating Assisted Couplers in a ring resonator (RR) configuration. It is shown that a trade-off must be made between performance, resonance bandwidth and device footprint. The device could be employed to realize reconfigurable add-drop de/multiplexers, although sensitivity to fabrication tolerances and spurious effects is observed. The second PIC is based on an unbalanced Mach-Zehnder interferometer loaded with two RRs. Overall good performance and robustness to fabrication tolerances and nonlinear effects confirm its applicability to the realization of flexible optical systems. Simulated and measured device behaviour are shown to agree, demonstrating the viability of a BB-based approach to the design of complex PICs.
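For illustration, a building-block circuit model of one such element can be as compact as the textbook add-drop ring resonator transfer functions. The following minimal sketch (Python/NumPy) uses the standard formulas with illustrative parameter values; it is not one of the designs characterized in the thesis, and dispersion of the effective index is deliberately ignored:

```python
import numpy as np

def add_drop_response(lam, radius=10e-6, n_eff=2.4, t1=0.9, t2=0.9, a=0.99):
    """Through- and drop-port power transmission of an add-drop ring.

    lam : wavelengths [m]; t1, t2 : self-coupling of the two couplers;
    a : single-pass amplitude transmission (round-trip loss).
    """
    L = 2 * np.pi * radius                 # ring circumference
    phi = 2 * np.pi * n_eff * L / lam      # round-trip phase
    denom = 1 - 2 * t1 * t2 * a * np.cos(phi) + (t1 * t2 * a) ** 2
    T_through = (t2**2 * a**2 - 2 * t1 * t2 * a * np.cos(phi) + t1**2) / denom
    T_drop = (1 - t1**2) * (1 - t2**2) * a / denom
    return T_through, T_drop

# Scan a window covering a couple of free spectral ranges.
lam = np.linspace(1.53e-6, 1.57e-6, 4001)
T_t, T_d = add_drop_response(lam)
print(lam[np.argmax(T_d)])  # a resonance wavelength within the window
```

Sweeping t1, t2 and the radius in such a model reproduces, at circuit level, the kind of bandwidth-versus-footprint trade-off the abstract refers to.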
Abstract:
Internet of Things systems are pervasive systems that have evolved from cyber-physical into large-scale systems. Owing to the number of technologies involved, their software development faces several integration challenges. Chief among them are those stemming from system heterogeneity, which raise interoperability issues. From a software engineering perspective, developers mostly experience the lack of interoperability in two phases of software development: programming and deployment. On the one hand, modern software tends to be distributed across several components, each adopting the most appropriate technology stack, pushing programmers to code in a protocol- and data-agnostic way. On the other hand, each software component should run in the most appropriate execution environment and, as a result, system architects strive to automate deployment on distributed infrastructures. This dissertation aims to improve the development process by introducing proper tools to handle certain aspects of system heterogeneity. Our effort focuses on three of these aspects and, for each one, we propose a tool addressing the underlying challenge. The first tool handles heterogeneity at the transport- and application-protocol level, the second manages different data formats, and the third computes optimal deployments. To realize the tools we adopted a linguistic approach, i.e., we provide specific linguistic abstractions that increase the expressive power of the programming language developers use, letting them write better solutions in more straightforward ways. To validate the approach, we implemented use cases showing that the tools can be used in practice and that they help achieve the expected level of interoperability. In conclusion, to move a step towards the realization of an integrated Internet of Things ecosystem, we target programmers and architects and propose that they use the presented tools to ease the software development process.
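As a rough illustration of what protocol-agnostic programming means in practice, application code can be written against a transport interface and bound to a concrete protocol later. This is a generic sketch, not the dissertation's actual linguistic abstractions; all names are hypothetical and the concrete clients are stubbed out:

```python
from abc import ABC, abstractmethod
import json

class Transport(ABC):
    """Abstracts the wire protocol away from application code."""
    @abstractmethod
    def send(self, destination: str, payload: bytes) -> None: ...

class HttpTransport(Transport):
    def send(self, destination: str, payload: bytes) -> None:
        # Stand-in for a real HTTP client POSTing to `destination`.
        print(f"HTTP POST {destination} ({len(payload)} bytes)")

class MqttTransport(Transport):
    def send(self, destination: str, payload: bytes) -> None:
        # Stand-in for a real MQTT client publishing on topic `destination`.
        print(f"MQTT publish {destination} ({len(payload)} bytes)")

def notify(transport: Transport, device: str, reading: dict) -> None:
    """Application logic: protocol- and data-format-agnostic."""
    transport.send(device, json.dumps(reading).encode())

# The same application code runs unchanged over either protocol.
notify(HttpTransport(), "sensors/42", {"temp_c": 21.5})
notify(MqttTransport(), "sensors/42", {"temp_c": 21.5})
```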
Abstract:
The ever-growing interest in scientific techniques able to characterise the materials and retrace the steps behind the execution of a painting has made them widely accepted in its investigation. This research discusses issues emerging from attribution and authentication studies and proposes best practices for the characterisation of materials and techniques, favouring the contextualisation of the results within an integrated approach; the work aims to systematically classify paintings into categories that aid the examination of objects. A first grouping of paintings is based on the information initially available on them, identifying four categories. A focus of this study is the examination of case studies, spanning from the 16th to the 20th century, to evaluate and validate the different protocols associated with each category, to expose the problems paintings raise, and to explain the advantages and limits of the approach. The research methodology incorporates a combined set of scientific techniques (non-invasive, such as technical imaging and XRF; micro-invasive, such as optical microscopy, SEM-EDS, FTIR, Raman microscopy and, in one case, radiocarbon dating) to answer these questions and, where necessary for the classification, exhaustively characterise the materials of the paintings. The creation of, and contribution to, shared technical databases covering various artists and their evolution over time is an objective tool that benefits this kind of study. Close collaboration among different professionals is an essential aspect of this research for the comprehensive study of a painting, as the integration of stylistic, documentary and provenance studies corroborates the scientific findings and helps in the successful contextualisation of the results and the reconstruction of the history of the object.
Abstract:
Healthy marine ecosystems support life on Earth and human well-being thanks to their biodiversity, which is declining mainly because of anthropogenic stressors. Monitoring how marine biodiversity changes through space and time is needed to properly define and implement effective actions for habitat conservation and preservation. This is particularly urgent in areas that are very species-rich relative to their limited surface area and that are subject to strong anthropic pressures, such as the Mediterranean Sea. Subtidal rocky benthic Mediterranean habitats have a complex structural architecture, hosting a panoply of tiny organisms (cryptofauna) that inhabit crevices and caves but remain largely unknown. Different artificial standardized sampling structures (SSS) and methods have been developed and employed to characterize the cryptofauna, allowing for data replicability and comparability across regions. Organisms growing on these artificial structures can be identified by coupling morphological taxonomy with DNA barcoding and metabarcoding. Metabarcoding allows organisms in a bulk sample to be identified without morphological analysis, by comparing their sequences with barcode sequences held in online repositories. Nevertheless, barcoded species currently represent only a small fraction of known species, and barcoding reference databases are not always curated and updated on a regular basis. In this Thesis I used an integrative approach to characterize benthic marine biodiversity, specifically coupling morphological and molecular techniques with the employment of SSS. Moreover, I updated the current status of COI (cytochrome c oxidase subunit I) barcoding of marine metazoans, and I built a customized COI barcoding reference database for metabarcoding studies on temperate biogenic reefs. This work expanded the knowledge of the diversity of Mediterranean marine communities, laying the groundwork for monitoring the marine and environmental changes that will occur in the near future as a consequence of anthropic and climate threats.
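To make the metabarcoding step concrete: taxonomic assignment amounts to comparing a query sequence against a curated reference database and accepting the best match above a similarity threshold. The sketch below is a toy illustration only, with k-mer Jaccard similarity standing in for proper sequence alignment; the sequences, taxa and threshold are invented, and this is not the pipeline used in the Thesis:

```python
def kmers(seq: str, k: int = 8) -> set[str]:
    """All k-length substrings of a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similarity(query: str, ref: str, k: int = 8) -> float:
    """Jaccard similarity between k-mer sets (a crude stand-in for alignment)."""
    a, b = kmers(query, k), kmers(ref, k)
    return len(a & b) / len(a | b)

def assign(query: str, reference_db: dict[str, str], threshold: float = 0.9):
    """Return the best-matching taxon, or None if no match clears the threshold."""
    taxon, score = max(((t, similarity(query, s)) for t, s in reference_db.items()),
                       key=lambda pair: pair[1])
    return (taxon, score) if score >= threshold else (None, score)

# Toy reference database with hypothetical entries.
db = {"Taxon A (hypothetical)": "ATGGCATTAGTAGGAACAGCCCTAAGC",
      "Taxon B (hypothetical)": "ATGACTTTATACTTTATTTTCGGAGCC"}
query = "ATGGCATTAGTAGGAACAGCCCTAAGT"  # one mismatch vs. Taxon A
print(assign(query, db))
```

The sketch also shows why database coverage and curation matter: a query from an unbarcoded species either falls below the threshold or, worse, is assigned to its nearest barcoded relative.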
Abstract:
In this doctoral dissertation, a comprehensive methodological approach to the assessment of the safety conditions of river embankments is proposed, based on the integrated use of laboratory testing, physical modelling and finite element (FE) numerical simulations. The aim is to contribute to a better understanding of the effect of time-dependent hydraulic boundary conditions on the hydro-mechanical response of river embankments. The case study and materials selected for this research project are representative of the riverbank systems of the Alpine and Apennine tributaries of the Po River (Northern Italy), which have experienced several sudden overall collapses in recent years. The outcomes of a centrifuge test, carried out under an enhanced gravity field of 50 g on a riverbank model made of a compacted silty-sand mixture overlying a homogeneous clayey-silt foundation layer and subjected to a simulated flood event, have been used to define a robust and realistic experimental benchmark. To reproduce the observed experimental behaviour, a first set of numerical simulations was carried out assuming rigid porous media under partially saturated conditions for both the embankment and the foundation unit. The mechanical and hydraulic soil properties adopted in the numerical analyses were carefully estimated from standard saturated triaxial, oedometer and constant-head permeability tests. Afterwards, advanced suction-controlled laboratory tests were carried out to investigate the effect of suction and confining stress on the shear strength and compressibility of the filling material, and a second set of numerical simulations was run with the soil parameters updated on the basis of these tests. The final aim of the study is a quantitative estimate of the predictive capabilities of the calibrated numerical tools, obtained by systematically comparing the results of the FE simulations with the experimental benchmark.
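For context, the 50-g environment follows from the standard geotechnical centrifuge scaling laws: a model built at scale 1/N and spun at N g reproduces prototype self-weight stresses, while diffusion-driven processes such as seepage during a flood run N² times faster. The relations below are the standard ones (subscripts m and p denote model and prototype); only N = 50 comes from the test described above:

```latex
\[
\sigma_m \;=\; \rho \,(N g)\,\frac{h_p}{N} \;=\; \rho\, g\, h_p \;=\; \sigma_p,
\qquad
L_m \;=\; \frac{L_p}{N},
\qquad
t_m^{\mathrm{seepage}} \;=\; \frac{t_p}{N^{2}},
\qquad
N = 50 .
\]
```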
Abstract:
In the literature on philosophical practices, despite the crucial role that argumentation plays in these activities, no specific argumentative theory has ever been proposed to assist the facilitator in conducting philosophical dialogue and to enhance students' critical thinking skills. The dissertation starts from a cognitive perspective that challenges the classic Cartesian notion of rationality by focusing on the limits and biases of human reasoning. An argumentative model (WRAT, Weak Reasoning Argumentative Theory) is then outlined to respond to the needs of philosophical dialogue. After justifying the claim that this learning activity, among other inductive methodologies, is the most suitable for critical thinking education, I inquire into the specific goal of ‘arguing’ in this context by means of the tools provided by Speech Act Theory: the speaker's intention is to construct new knowledge by questioning her own and others' beliefs. The model is theorized on this assumption, from which its goals and, in turn, the related norms are pinpointed. To include all the epistemic attitudes required to accomplish the complex task of arguing in philosophical dialogue, I needed to draw on two opposing cognitive accounts, Dual Process Theory and the Evolutionary Approach, which, although they offer incompatible descriptions of reasoning, can be integrated into a normative account of argumentation. Besides offering a theoretical contribution to argumentation studies, the model is designed to be applied to the Italian educational system, in particular to classes in technical and professional high schools belonging to the newly created Inventio network. This initiative is one of the outcomes of the research project of the same name, which also includes an original Syllabus, research seminars, a monitoring action and publications focused on introducing philosophy, in the form of workshop activities, into technical and professional schools.
Abstract:
The integration of distributed and ubiquitous intelligence has emerged over the last years as the mainspring of transformative advances in mobile radio networks. As we approach the era of “mobile for intelligence”, next-generation wireless networks are poised to undergo profound changes. Notably, the overarching challenge ahead is the development and implementation of integrated communication and learning mechanisms that enable autonomous mobile radio networks. The ultimate pursuit of eliminating the human-in-the-loop is an ambitious challenge, requiring a meticulous delineation of the fundamental characteristics that artificial intelligence (AI) should possess to achieve this objective. This challenge represents a paradigm shift in the design, deployment, and operation of wireless networks, where conventional, static configurations give way to dynamic, adaptive, AI-native systems capable of self-optimization, self-sustainment, and learning. This thesis provides a comprehensive exploration of the fundamental principles and practical approaches required to create autonomous mobile radio networks that seamlessly integrate communication and learning components. The first chapter introduces the notion of Predictive Quality of Service (PQoS) and adaptive optimization, and expands on the challenge of achieving adaptable, reliable, and robust network performance in dynamic, ever-changing environments. The subsequent chapter delves into the revolutionary role of generative AI in shaping next-generation autonomous networks, emphasizing trustworthy, uncertainty-aware generation processes based on approximate Bayesian methods, and shows how generative AI can improve generalization while reducing data communication costs. Finally, the thesis addresses distributed learning over wireless networks. Distributed learning and its variants, including multi-agent reinforcement learning and federated learning, have the potential to meet the scalability demands of modern data-driven applications, enabling efficient and collaborative model training across dynamic scenarios while preserving data privacy and reducing communication overhead.
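As a concrete instance of the federated learning mentioned above, the following minimal sketch implements federated averaging (FedAvg) on a toy linear regression task: each client runs local gradient descent on its private data, and the server averages the resulting weights in proportion to local dataset size. The model, data and hyperparameters are illustrative, not the systems studied in the thesis:

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """Local gradient descent on a least-squares loss; raw data stays on the client."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, clients):
    """One communication round: clients train locally, server averages by data size."""
    total = sum(len(y) for _, y in clients)
    return sum(len(y) / total * local_update(w_global.copy(), X, y)
               for X, y in clients)

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
clients = []
for _ in range(4):  # four clients, each with a private local dataset
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true + 0.05 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(20):          # only the 2-dim weight vector crosses the network
    w = fedavg_round(w, clients)
print(w)  # converges close to w_true
```

The privacy and communication-overhead claims in the abstract map directly onto the sketch: only model weights, never raw samples, are exchanged each round.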