477 results for NBS toolkit
Abstract:
The coastal area of Emilia-Romagna (ER), on the Italian side of the northern Adriatic Sea, is considered for the implementation of an unstructured numerical ocean model, with the aim of developing innovative tools for coastal management and a forecasting system for storm surge risk reduction. The Adriatic Sea has been the focus of several studies because of its peculiar dynamics, driven by many forcings acting at basin and local scales. The ER coast is particularly exposed to storm surge events: under particular conditions, winds, tides and seiches may combine and contribute to the flooding of the coastal area. The global sea level rise expected in the next decades will further increase the hazard along the ER and Adriatic coasts. Reliable Adriatic- and Mediterranean-scale numerical ocean models are now available, allowing the dynamical downscaling of very high-resolution models in limited coastal areas. In this work the numerical ocean model SHYFEM is implemented in the Goro lagoon (named GOLFEM) and along the ER coast (ShyfER) to test innovative solutions against sea-related coastal hazards. GOLFEM was successfully applied to analyze the Goro lagoon dynamics and to assess the dynamical effects of human interventions through the analysis of what-if scenarios. The assessment of storm surge hazard in the Goro lagoon was carried out by developing an ensemble storm surge forecasting system with GOLFEM, using forcings from different operational meteorological and ocean models, which showed the fundamental importance of the boundary conditions. The ShyfER domain is used to investigate innovative solutions against storm-surge-related hazards along the ER coast. Seagrass is assessed as a nature-based solution (NBS) for coastal protection under present and future climate conditions. The results show negligible effects on sea level but appreciable effects in reducing bottom current velocity.
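To illustrate the ensemble idea mentioned in this abstract, the following is a minimal sketch, not taken from the GOLFEM system, of how sea-level forecasts driven by different forcings might be combined into an ensemble mean, spread and threshold-exceedance probability; all values, the 0.8 m threshold and the member count are illustrative assumptions.

```python
import numpy as np

# Synthetic example: sea-level forecasts (m) from three hypothetical forcing
# combinations on a common hourly time axis (values are made up, not GOLFEM output).
rng = np.random.default_rng(0)
hours = np.arange(48)
members = np.vstack([
    0.6 * np.sin(2 * np.pi * hours / 12.4) + 0.3 + 0.05 * rng.standard_normal(48)
    for _ in range(3)
])  # shape: (n_members, n_times)

ens_mean = members.mean(axis=0)          # ensemble mean sea level
ens_spread = members.std(axis=0)         # ensemble spread, a proxy for forecast uncertainty
p_exceed = (members > 0.8).mean(axis=0)  # fraction of members above a 0.8 m flooding threshold
```

In an operational setting each row would come from a separate model run forced by a different meteorological or ocean product, rather than from synthetic data.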
Abstract:
I set out the pros and cons of conferring legal personhood on artificial intelligence systems (AIs), mainly under civil law. I provide functionalist arguments to justify this policy choice and identify the content that such a legal status might have. Although personhood entails holding one or more legal positions, I will focus on the distribution of liabilities arising from unpredictably illegal and harmful conduct. Conferring personhood on AIs might efficiently allocate risks and social costs, ensuring protection for victims, incentives for production, and technological innovation. I also consider other legal positions, e.g., the capacity to act, the ability to hold property, make contracts, and sue (and be sued). However, I contend that even assuming that conferring personhood on AIs finds widespread consensus, its implementation requires solving a coordination problem, determined by three asymmetries: technological, intra-legal systems, and inter-legal systems. I address the coordination problem through conceptual analysis and metaphysical explanation. I first frame legal personhood as a node of inferential links between factual preconditions and legal effects. Yet, this inferentialist reading does not account for the ‘background reasons’, i.e., it does not explain why we group divergent situations under legal personality and how extra-legal information is integrated into it. One way to account for this background is to adopt a neo-institutional perspective and update its ontology of legal concepts with further layers: the meta-institutional and the intermediate. Under this reading, the semantic referent of legal concepts is institutional reality. So, I use notions of analytical metaphysics, such as grounding and anchoring, to explain the origins and constituent elements of legal personality as an institutional kind. Finally, I show that the integration of conceptual and metaphysical analysis can provide the toolkit for finding an equilibrium around the legal-policy choices that are involved in including (or not including) AIs among legal persons.
Abstract:
Neuroblastoma (NB) is the most common type of tumor in infants and the third most common cancer in children. Current clinical practices employ a variety of strategies for NB treatment, ranging from standard chemotherapy to immunotherapy. Due to a lack of knowledge about the molecular mechanisms underlying the disease's onset, aggressive phenotype, and therapeutic resistance, these approaches are ineffective in the majority of instances. MYCN amplification is one of the most well-known genetic alterations associated with high risk in NB. The following work is divided into three sections and aims to provide new insights into the biology of NB and hypothetical new treatment strategies. First, we identified RUNX1T1 as a key gene involved in MYCN-driven NB onset in a transgenic mouse model. Our results suggested that RUNX1T1 may recruit the Co-REST complex on target genes that regulate the differentiation of NB cells and that its interaction with RCOR3 is essential. Second, we provided insights into the role of MYCN in dysregulating the CDK/RB/E2F pathway controlling the G1/S transition of the cell cycle. We found that RB is dispensable in regulating the cell cycle of MYCN-amplified NB, providing a rationale for using cyclin/CDK complex inhibitors in NBs carrying MYCN amplification and relatively high levels of RB1 expression. Third, we generated an M13 bacteriophage platform to target GD2-expressing cells in NB. Specifically, we engineered a recombinant M13 phage capable of selectively binding GD2-expressing cells (M13GD2). Our results showed that M13GD2 chemically conjugated with the photosensitizer ECB04 preserves its retargeting capability, inducing cell death even at picomolar concentrations upon light irradiation. These results provide proof of concept for the use of M13 phages in targeted photodynamic therapy for NB, an exciting strategy to overcome resistance to classical immunotherapy.
Abstract:
Today, European cities are the main centres of culture, innovation and economic development. However, hosting about 75% of the population and consuming almost 80% of the energy produced, they contribute substantially to climate change through their significant greenhouse gas emissions and, at the same time, suffer its most intense effects. The European Community has acknowledged the need for a synergic action that adopts climate mitigation strategies and provides adaptation measures to cope with climate impacts that are by now unavoidable. The orientation of the European Research and Innovation Programmes on smart and climate-neutral cities shifts the attention from the urban dimension to the district scale. In this perspective, Positive Energy Districts (PEDs) take shape as newly built districts, but also as ambitious solutions for the regeneration of existing neighbourhoods, which actively manage their energy demand with net-zero emissions and a surplus of energy produced from renewables. This doctoral research focuses on the PED model, exploring its potential applicability in the consolidated urban context. Specifically, the thesis develops two original research contributions: the PED-Portfolio and the PED-Toolkit. These contributions propose a systemic approach through which to undertake a path of knowledge and experimentation of the PED model, in a perspective of reducing energy demand but also of improving the accessibility, liveability and resilience of these districts. In order to verify the applicability of the research results, the tools developed are tested on a pilot area, and the outcomes of this experimentation are then compared with the state of the art and with the main international research lines on PEDs, refining the results of the doctoral project in a research-experimentation-research process.
Abstract:
The Nature-Based Solutions (NBS) concept and approach were developed to simultaneously address challenges such as risk mitigation and biodiversity conservation and restoration. NBSs have been endorsed by major international organizations such as the EU, the FAO and the World Bank, which are pushing to enable a mainstreaming process. However, the shift from traditional engineering "grey" solutions to a wider and standard adoption of NBS encounters technical, social, cultural, and normative barriers, which were identified through a qualitative content analysis of policy documents, reports and expert interviews. The case of the Emilia-Romagna region was studied by developing an analytical framework that brings together the social-ecological context, the governance system and the characteristics of specific NBSs.
Abstract:
This thesis focuses on the use and implementation of advanced models for measuring the resilience of water distribution networks. In particular, the functions implemented in GRA Tool, a software package developed by the University of Exeter (UK), and the functions of the EPANET 2.2 Toolkit were investigated. The study of resilience and failure, carried out with GRA Tool and with a methodology based on the combined use of EPANET 2.2 and MATLAB, was first tested on a small water distribution network from the literature, so that the variability of the results could be perceived more clearly and immediately, and then on a more complex network, that of Modena. Specifically, a time-deferred failure mode proposed by GRA Tool, namely pipe failure, was reproduced in order to compare the two methodologies. The analysis of hydraulic efficiency was conducted using a synthetic, global network performance index, the Resilience Index introduced by Todini and developed between 2000 and 2016. This index, being one of the parameters with which the overall state of "hydraulic well-being" of a network can be evaluated, has the advantage of acting as a criterion for selecting improvements to be made to the network itself. Furthermore, these analyses trace the analytical development that the Resilience Index formula has undergone over time. The final aim of this thesis was to understand how the resilience of the system under study could be improved: the pipe-rupture scenario was introduced precisely to identify the most critical branches, i.e., those whose failure would cause the greatest damage to the network, including a drop in the Resilience Index.
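For reference, a commonly cited form of Todini's Resilience Index is reproduced below; this is a sketch of the standard formulation, and the exact expression used in the thesis may differ across the 2000-2016 revisions it discusses.

```latex
I_r = \frac{\sum_{i=1}^{n_n} q_i \,\left( h_i - h_i^{*} \right)}
           {\sum_{k=1}^{n_r} Q_k H_k \;+\; \sum_{j=1}^{n_p} \frac{P_j}{\gamma} \;-\; \sum_{i=1}^{n_n} q_i h_i^{*}}
```

Here q_i and h_i are the demand and the actual head at node i, h_i^{*} is the minimum head required at node i, Q_k and H_k are the discharge and head of reservoir k, P_j is the power supplied by pump j, and γ is the specific weight of water; values close to 1 indicate a large surplus of internal energy available to cope with failures.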
Abstract:
Stellar occultations are the most accurate Earth-based astronomy technique for obtaining the lateral position of celestial bodies; in the case of natural satellites, their accuracy also depends on the knowledge of the central body around which the satellite orbits. The main goal of this thesis is to analyze if and how very long baseline interferometry (VLBI) measurements of a body like Jupiter can be used in support of stellar occultations of its natural satellites by reducing the planetary uncertainty at the time of the occultation. In particular, we analyzed the stellar occultation events of Callisto (15.01.2024) and Io (02.04.2021). The stellar occultation of Callisto was predicted and simulated using the stellar occultation reduction analysis (SORA) toolkit, while that of Io had already been studied by Morgado et al. We then simulated the VLBI data of Jupiter according to the current JUNO trajectories. The simulated observations were used as input to an estimation, on whose estimated parameters we performed a covariance analysis to retrieve the formal errors (1σ uncertainties) at each epoch of the propagation. The results show that adding VLBI slightly improves the uncertainty of Callisto even when the knowledge of Jupiter is poorer, while for Io the VLBI data are especially crucial in the scenario of an a priori uncertainty in the Jupiter state of about 10 km. In that case the estimated initial state of Io improves by about 70 m, 230 m and 900 m in the radial, along-track and cross-track directions, respectively. Moreover, we also obtained the propagated errors of the two moons in terms of right ascension and declination, both of which show uncertainties at the mas level at the occultation time. Finally, we simulated Io and Europa together and observed that at the time of the stellar occultation of Europa the along-track component of Io is constrained, confirming the coupling between the two inner moons.
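As a purely illustrative aside, the sketch below shows how 1σ formal errors can be read off an estimated position covariance after rotating it into radial, along-track and cross-track components; it is not the thesis code, and the covariance, position and velocity values are made-up placeholders.

```python
import numpy as np

def rtn_frame(r, v):
    """Build the radial / along-track / cross-track unit vectors
    from an inertial position r and velocity v."""
    u_r = r / np.linalg.norm(r)          # radial
    u_c = np.cross(r, v)
    u_c = u_c / np.linalg.norm(u_c)      # cross-track (orbit normal)
    u_t = np.cross(u_c, u_r)             # along-track (completes the right-handed triad)
    return np.vstack([u_r, u_t, u_c])    # rows: R, T, N

def formal_errors_rtn(P_pos, r, v):
    """Rotate a 3x3 position covariance into the RTN frame and return
    the 1-sigma formal errors (square roots of the diagonal)."""
    R = rtn_frame(r, v)
    P_rtn = R @ P_pos @ R.T
    return np.sqrt(np.diag(P_rtn))

# Illustrative numbers only (not taken from the thesis):
P_pos = np.diag([200.0**2, 500.0**2, 1000.0**2])  # m^2, inertial frame
r = np.array([4.217e8, 0.0, 0.0])                 # m, roughly Io's orbital radius
v = np.array([0.0, 17.33e3, 0.0])                 # m/s, roughly Io's orbital speed
print(formal_errors_rtn(P_pos, r, v))             # 1-sigma errors in R, T, N (m)
```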
Abstract:
The aim of this thesis is to analyse which are currently the best frameworks for developing Mixed Reality software and to study the design patterns most useful to a developer in this field. The first chapter introduces the concepts of extended, virtual, augmented and mixed reality and the differences between them. It also describes the devices that enable mixed reality, in particular the two most widely used headsets: Microsoft Hololens 2 and Magic Leap 1. The same chapter also presents the key aspects of mixed reality development, i.e. all the elements that make a Mixed Reality experience possible. The second chapter describes the frameworks and kits useful for developing cross-platform mixed reality applications. In particular, it introduces the two most widely used development environments, Unity and Unreal Engine, which already exist and are not specific to MR development but become functional when integrated with dedicated kits such as the Mixed Reality ToolKit. The third chapter deals with design patterns, either common or native to extended reality applications, that are useful for the sound development of MR applications. It also examines some of the main patterns most used in object-oriented programming and verifies whether and how they can be correctly implemented in Unity in a mixed reality scenario. This analysis is useful to understand whether the use of development frameworks, the most commonly adopted approach, limits the developer's freedom.
Abstract:
The need for sustainable economic growth and environmental stewardship emerged around the start of the twentieth century, when society became aware that the traditional development model would lead to the collapse of the terrestrial ecosystem in the long run. Over the years, the international community's environmental efforts have demonstrated unequivocally that the planet's limits are real, and the new development approach has laid the groundwork for the future. According to this model, design also plays a key role in ensuring a better future: design has undergone an ecological and sustainable evolution as a result of the global environmental crisis and the degradation of our ecosystem and biodiversity. In this context, Prosperity Thinking is situated, a still-evolving methodology developed by the Future Food Institute since 2019. The main concepts on which it is based are described, as well as the method that identifies it, which is divided into the following stages: 1) Problem Framing, 2) Ideation and Prototyping, 3) Test & Analyze. The development of the Prosperity Thinking toolkit is described, beginning with the search for tools from the literature on sustainable design and ending with its validation with the help of design experts. The testing of some tools during a workshop organized by FFI, attended by 15 people ranging in age from 14 to 40, is then recounted, and the final version of the toolkit, obtained by adding the tools proposed by the experts, is presented. Finally, a reflection is offered on the future of Prosperity Thinking, a method in constant evolution that must continue to follow societal and environmental changes in order to respond to the increasingly complex challenge of sustainability.
Abstract:
This thesis has been written as a result of the Language Toolkit project, organised by the Department of Interpreting and Translation of Forlì in collaboration with the Chamber of Commerce of Romagna. The aim of the project is to facilitate the internationalisation of small and medium enterprises in Romagna by making available to them the skills acquired by the students of the Faculty of Specialized Translation, who in turn are given the opportunity to approach an authentic professional context. Specifically, this thesis is the outcome of the 300-hour internship envisaged by the project, 75 of which were carried out at Jopla S.r.l. SB. The task assigned to the student was the translation into French of the Jopla For You web app and the Jopla PRO mobile app. This thesis consists of five chapters. The first chapter provides a general description of the Language Toolkit project and focuses on the concept of translation into a non-native language. The second chapter outlines the theoretical context in which the translation is set. The third chapter shifts the focus to the topics of text, discourse, genre and textual typology, alongside a reflection on the applicability of these notions to web texts, and an analysis of the source text following Nord's model. The fourth chapter is dedicated to a description of the resources used in the preparation and translation phases. The fifth chapter describes the macro- and micro-strategies employed to carry out the translation. Furthermore, a comparative analysis between the human translation and the one produced by Google Translate is presented. This analysis involves two methods: the first follows the linguistic norms of the target language, while the second relies on the error categorisation of the MQM model. Finally, the performance of Google Translate is investigated by comparing the results of the MQM evaluation conducted in this thesis with those obtained by Martellini (2021) in her analysis.
Abstract:
The participation in the Language Toolkit program—a joint initiative of the Department of Interpretation and Translation of Forlì (University of Bologna) and the Chamber of Commerce of Romagna—led to the creation of this dissertation. This program aims to support the internationalization of SMEs while also introducing near-graduates to a real professional context. The author collaborated on this project with Leonori, a company that produces and sells high jewelry products online and through retailers. The purpose of this collaboration is to translate and localize a part of the company website from Italian into Chinese, so as to facilitate the company's internationalization process towards Chinese-speaking countries. This dissertation is organized according to the usual stages a translator goes through when carrying out a translation task. First and foremost, however, it was necessary to provide a theoretical background pertaining to the topics of the project. Specifically, the first chapter introduces the Language Toolkit program, the concept of internationalization and the company itself. The second chapter is dedicated to the topic of inverse translation, including the advantages and disadvantages of this practice. The third chapter highlights the main features of localization, with a particular emphasis on web localization. The fourth chapter deals with the analysis of the source text, according to the looping model developed by Nord (1991). The fifth chapter describes in detail the methods implemented for the creation of the language resources, i.e., two comparable monolingual corpora and a termbase, which were built ad hoc for this specific project. The sixth chapter outlines all the translation strategies that were implemented, providing some examples from the source text. The final chapter describes the revision process, which occurred both during and after the translation phase.
Abstract:
Bistatic radar experiments have been successfully employed in space exploration to remotely probe planetary surfaces through the reflection of a radio signal off a target body. An appropriate analysis of the reflected echoes can provide information on the structure, chemical composition and roughness of the target surface at scales proportional to the transmitted wavelengths. This study proposes the modelling of the geometry of the radio link between JUICE and the Earth in order to find opportunities for the spacecraft to perform bistatic radar experiments on the surface of Ganymede during the flybys of the moon only. Even though the scientific phase of the mission has not yet been planned in detail, these flybys could coincide with plausible time windows for the implementation of the experiments analysed. Further considerations then concern the incidence angle and its effect on the accuracy of the estimate of the surface dielectric constant of the moon, which can be obtained from bistatic observations. The main algorithm for the computation of the specular point and the plots presented were implemented with the help of the MATLAB software and the SPICE toolkit. The results obtained from the analysis of the reference flybys show that, for most of them, the mission geometry is not the most favourable for this type of observation. Only three of the seven flybys analysed, G04, G05 and G06, turn out to have a favourable geometry for bistatic radar experiments on Ganymede.
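As a hedged illustration of the kind of geometry computation involved, the sketch below uses spiceypy (the Python interface to the SPICE toolkit, whereas the thesis used MATLAB) to retrieve the JUICE and Earth positions as seen from Ganymede and the transmitter-target-receiver angle at a sample epoch; the kernel file names and the epoch are placeholders, and this is only a first step toward the actual specular-point search, not the thesis algorithm.

```python
import numpy as np
import spiceypy as spice

# Illustrative kernel set: leapseconds, planetary/satellite ephemerides and a JUICE SPK
# (file names are placeholders, not the ones actually used in the thesis; the JUICE
# name/ID association comes from the mission kernels).
for k in ["naif0012.tls", "de440s.bsp", "jup365.bsp", "juice_crema.bsp"]:
    spice.furnsh(k)

et = spice.str2et("2032 JUL 15 12:00:00 TDB")  # example epoch, not a real flyby time

# Positions of JUICE and Earth as seen from Ganymede (light-time corrected).
r_juice, _ = spice.spkpos("JUICE", et, "J2000", "LT+S", "GANYMEDE")
r_earth, _ = spice.spkpos("EARTH", et, "J2000", "LT+S", "GANYMEDE")

# Bistatic angle at the centre of Ganymede between transmitter (JUICE) and receiver (Earth).
cos_b = np.dot(r_juice, r_earth) / (np.linalg.norm(r_juice) * np.linalg.norm(r_earth))
bistatic_angle = np.degrees(np.arccos(np.clip(cos_b, -1.0, 1.0)))
print(f"JUICE-Ganymede-Earth angle: {bistatic_angle:.2f} deg")

spice.kclear()
```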