5 results for Process-dissociation Framework
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
The potential that the Web has reached in terms of accessibility, management, and flexibility has driven the most diverse sectors to approach it and adopt it within their organisations. It has therefore become necessary to apply new solutions to traditional web applications in order to integrate workflow-management elements with the navigation and presentation data models. This thesis addresses aspects related to business processes, with reference to the design and development of Web applications. Modelling standards such as UML and BPMN are introduced, followed by a description of existing solutions and case studies. The second part of the work presents the technologies used for the design and development of a framework supporting process-aware Web applications.
Abstract:
This dissertation documents the development of a project, carried out over a span of more than two years within the scope of the Arrowhead Framework, which bears my personal contribution in several sections. The final part of the project took place during a visiting period at the University of Luleå. The Arrowhead Project is a European project, belonging to the ARTEMIS association, which aims to foster new technologies and unify access to them in a single framework. Such technologies include the Internet of Things phenomenon, Smart Houses, Electrical Mobility, and renewable energy production. An application is considered compliant with the framework when it respects the Service Oriented Architecture paradigm and is able to interact with a set of defined components called Arrowhead Core Services. My personal contribution to this project is the development of several user-friendly APIs, published in the project's main repository, and the integration of a legacy system within the Arrowhead Framework. The implementation of this legacy system was initiated by me in 2012 and, after many improvements carried out by several developers at UniBO, it was again significantly modified this year in order to achieve compatibility. The system consists of a simulation of an urban scenario in which a certain number of electric vehicles travel along their specified routes. The vehicles consume their battery charge and thus need to recharge at charging stations. Because the recharge process is long, the vehicles must use a reservation mechanism in order to recharge without waiting in line. The integration with the above-mentioned framework consists of the publication of the services that the system provides to end users, through the instantiation of several Arrowhead Service Producers, together with a demo Arrowhead-compliant client application able to consume such services.
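The reservation mechanism the abstract describes, in which vehicles book a charging slot in advance to avoid queueing, could be sketched in simplified form as below. This is a hypothetical illustration, not the thesis code: the `ChargingStation` class, its capacity model, and the method names are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ChargingStation:
    """Hypothetical sketch of a charging-station reservation mechanism:
    a vehicle is granted a slot only if the station has free capacity,
    so long recharge times do not create waiting lines."""
    station_id: str
    capacity: int
    reservations: set = field(default_factory=set)

    def reserve(self, vehicle_id: str) -> bool:
        # Grant the reservation only if a slot is free and the
        # vehicle does not already hold one.
        if len(self.reservations) < self.capacity and vehicle_id not in self.reservations:
            self.reservations.add(vehicle_id)
            return True
        return False

    def release(self, vehicle_id: str) -> None:
        # Free the slot once the vehicle has finished recharging.
        self.reservations.discard(vehicle_id)

station = ChargingStation("cs-01", capacity=2)
assert station.reserve("ev-1")
assert station.reserve("ev-2")
assert not station.reserve("ev-3")  # station full, ev-3 must wait
station.release("ev-1")
assert station.reserve("ev-3")
```

In the actual system such a check would presumably be exposed as an Arrowhead service consumed by the client application, rather than a local method call.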
Abstract:
Cities are key locations where Sustainability needs to be addressed at all levels, as land is a finite resource. However, not all urban spaces are exploited to their full potential, and land developers often regard unused, misused, or poorly designed urban portions as impracticable constraints. Furthermore, public authorities fail to turn these urban spaces into valuable opportunities where Sustainable Urban Development (SUD) may flourish. Arguing that these spatial elements are at the centre of SUD, the paper elaborates a prototype in the form of a conceptual strategic planning framework, committed to the effective recycling of city spaces through a flexible and multidisciplinary approach. Firstly, the research undertakes a broad review of the Sustainability literature, highlighting established principles and guidelines and building a sound theoretical base for the new concept. It then investigates the origins of, and proposes a definition, characterisation, and classification for, urban “R-Spaces”. Secondly, formal, informal, and temporary fitting functions are analysed and assembled into a portfolio meant to enhance adaptability and enlarge the choices for on-site interventions. Thirdly, the study outlines ideal quality requirements for a sustainable planning process. The findings are then condensed into the proposal, which is articulated in the identification of tools, actors, plans, processes, and strategies. Afterwards, the prototype is tested on two case studies: the Solar Community (Casalecchio di Reno, Bologna) and the Hyllie Sustainable City Project, the latter developed via an international workshop (ACSI-Camp, Malmö, Sweden). The qualitative results suggest, inter alia, the need to right-size spatial interventions, separate structural and operative actors, involve synergy multipliers and intermediaries (e.g. entrepreneurial HUBs, innovation agencies, cluster organisations…), maintain stakeholder diversity, and create a circular process open to new participants. Finally, the paper speculates on a transfer of the Swedish case study to Italy and indicates desirable directions for future research to support implementation of the prototype.
Abstract:
One of the main process features under study in Cognitive Translation & Interpreting Studies (CTIS) is the chronological unfolding of tasks. Analyses of time spans in translation have been conceived in two ways: (1) studying those falling between text units of different sizes (words, phrases, sentences, and paragraphs); (2) setting arbitrary time-span thresholds and exploring where they fall in the text, whether between text units or not. Writing disfluencies may offer comprehensive insights into the cognitive activities involved in typing while translating. Indeed, long time spans are often taken as hints that cognitive resources have been diverted from typing to other activities, such as planning and evaluating. This exploratory pilot study combined both approaches to seek potential general tendencies and contrasts in informants’ inferred mental processes when performing different writing tasks, through the analysis of their keylogged behaviors. The study tasks were retyping, monolingual free writing, translation, revision, and a multimodal task, namely monolingual text production based on an infographic leaflet. Task logs were chunked, and shorter time spans, including those within words, were analyzed following the Task Segment Framework (Muñoz & Apfelthaler, in press). Finally, time-span analysis was combined with analysis of the texts in terms of lexical density, type-token ratio, and word frequency. Several previous results were confirmed, while others were surprising. Time spans in free writing were longer between paragraphs and sentences, possibly hinting at planning; in translation, they were longer between clauses and words, suggesting more cognitive activity at these levels. On the other hand, the infographic was expected to facilitate the writing process, yet most time spans were longer than in both free writing and translation. The results of the multimodal task, among others, suggest avenues for further research.
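The core measurement the abstract relies on, classifying the pause before each keystroke by the kind of text boundary it falls on, can be sketched as follows. This is a hypothetical simplification, not the Task Segment Framework itself: the boundary categories, the keylog format `(timestamp_ms, char)`, and all function names are assumptions for illustration.

```python
def boundary_type(prev_char: str, char: str) -> str:
    """Classify the pause between two consecutive keystrokes by the
    text boundary it sits on (coarsest boundary wins)."""
    if prev_char == "\n":
        return "paragraph"
    if prev_char in ".!?":
        return "sentence"
    if prev_char == " " or char == " ":
        return "word"
    return "within-word"

def spans_by_boundary(keylog):
    """keylog: list of (timestamp_ms, char) pairs in typing order.
    Returns a dict mapping boundary type to the list of pause lengths."""
    spans = {}
    for (t0, c0), (t1, c1) in zip(keylog, keylog[1:]):
        spans.setdefault(boundary_type(c0, c1), []).append(t1 - t0)
    return spans

# Toy log: "Hi. B" typed with a long pause after the full stop.
log = [(0, "H"), (120, "i"), (260, "."), (900, " "), (1450, "B")]
spans = spans_by_boundary(log)
assert spans["sentence"] == [640]  # the pause between "." and " "
```

Aggregating these per-boundary distributions across tasks is what would allow the comparisons the abstract reports, e.g. longer between-sentence spans in free writing than in retyping.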
Abstract:
The scientific success of the LHC experiments at CERN depends heavily on the availability of computing resources to efficiently store, process, and analyse the amount of data collected every year. This is ensured by the Worldwide LHC Computing Grid infrastructure, which connects computing centres distributed all over the world through a high-performance network. The LHC has an ambitious experimental program for the coming years, including large investments and improvements both in the detector hardware and in the software and computing systems, in order to deal with the huge increase in event rate expected in the High Luminosity LHC (HL-LHC) phase and, consequently, with the huge amount of data that will be produced. In recent years, Artificial Intelligence has taken on a relevant role in the High Energy Physics (HEP) world. Machine Learning (ML) and Deep Learning algorithms have been used successfully in many areas of HEP, such as online and offline reconstruction programs, detector simulation, object reconstruction and identification, and Monte Carlo generation, and they will surely be crucial in the HL-LHC phase. This thesis aims to contribute to a CMS R&D project on an ML "as a Service" solution for HEP needs (MLaaS4HEP). It consists of a data service able to perform an entire ML pipeline (reading data, processing data, training ML models, and serving predictions) in a completely model-agnostic fashion, directly using ROOT files of arbitrary size from local or distributed data sources. This framework has been updated with new features in the data preprocessing phase, giving the user more flexibility. Since the MLaaS4HEP framework is experiment-agnostic, the ATLAS Higgs Boson ML challenge was chosen as the physics use case, with the aim of testing MLaaS4HEP and the contributions made in this work.
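The "read → preprocess → train → serve" pipeline that the abstract describes as model-agnostic can be illustrated with a minimal sketch in which the reader, the preprocessing step, and the model are all pluggable. This is NOT the MLaaS4HEP API: the function signatures, the `Model` protocol, and the toy `MajorityModel` below are hypothetical stand-ins; the real service reads ROOT files and accepts user-supplied ML models.

```python
from typing import Callable, Iterable, List, Protocol, Tuple

class Model(Protocol):
    """Anything with fit/predict can be plugged into the pipeline,
    which is what makes it model-agnostic."""
    def fit(self, X: List[List[float]], y: List[int]) -> None: ...
    def predict(self, X: List[List[float]]) -> List[int]: ...

def run_pipeline(read: Callable[[], Iterable[Tuple]],
                 preprocess: Callable[[Tuple], Tuple[List[float], int]],
                 model: Model) -> Model:
    """Read events, preprocess each into (features, label), train the model."""
    X, y = [], []
    for event in read():
        features, label = preprocess(event)
        X.append(features)
        y.append(label)
    model.fit(X, y)
    return model

# Trivial majority-class model, just to exercise the pluggable interface.
class MajorityModel:
    def fit(self, X, y):
        self.label = max(set(y), key=y.count)
    def predict(self, X):
        return [self.label] * len(X)

events = [((0.1, 0.2), 1), ((0.3, 0.1), 1), ((0.9, 0.8), 0)]
trained = run_pipeline(lambda: events,
                       lambda e: (list(e[0]), e[1]),
                       MajorityModel())
assert trained.predict([[0.5, 0.5]]) == [1]
```

The design point is that `run_pipeline` never inspects the model's internals; swapping the toy model for a deep network would leave the reading and preprocessing stages untouched, mirroring the agnosticism the abstract claims for MLaaS4HEP.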