Abstract:
The consequences of algorithmic management for workers are well known among scholars, but little research investigates the possibilities for agency, especially at the individual level, in the gig economy. Starting from the everyday reality of work, the aim is to analyse the forms of agency exercised by platform workers in the last-mile logistics sector. The research is based on a multi-sited ethnography conducted in two distant countries and covering two different urban platform services: food delivery in Italy (Bologna, Turin) and ride-hailing in Argentina (Buenos Aires). Despite the differences, fieldwork revealed several continuities between the geographical contexts. First, digital technologies play an ambivalent role in the work environment: while technology is used by companies to discipline labour, it is also a tool that can be turned to the workers' advantage. In both ride-hailing and food-delivery platforms, workers express their agency by sharing workaround practices and tactics to circumvent algorithmic despotism. Second, the research brought to light a wide variety of economic activities developed at the margins of the platform economy. In both cases, platforms intersect vividly with urban informal economies and feed informal labour circuits, as shown by the high incidence of illicit exchanges: for example, the sale of accounts, hacking bots, and digital gangmastering (caporalato digitale). Far from initiating a process of formalization, then, the platform subsumes and reproduces the set of productive and reproductive conditions of informality (viração), offering intermittent and insecure employment to a mass of disposable workers available for underemployment. In conclusion, platforms are defined as baroque infrastructures, where the baroque denotes both the hybrid nature of action, which mixes forms of neoliberalism-from-below with practices of peer solidarity, and the progressive restructuring of accumulation processes under a renewed interdependence between the formal and the informal in the infrastructures of the «mondo a domicilio» ("world delivered to your door").
Abstract:
One of the most visionary goals of Artificial Intelligence is to create a system able to mimic and eventually surpass the intelligence observed in biological systems, including, most ambitiously, that of humans. The main distinctive strength of humans is their ability to build a deep understanding of the world by learning continuously and drawing on their experiences. This ability, found to various degrees in all intelligent biological beings, allows them to adapt and react properly to change by incrementally expanding and refining their knowledge. Arguably, achieving this ability is one of the main goals of Artificial Intelligence and a cornerstone of the creation of intelligent artificial agents. Modern Deep Learning approaches have allowed researchers and industry to make great advances towards the resolution of many long-standing problems in areas like Computer Vision and Natural Language Processing. However, while the current age of renewed interest in AI has enabled extremely useful applications, concerningly little effort is being directed towards the design of systems able to learn continuously. The biggest obstacle to an AI system learning incrementally is the catastrophic forgetting phenomenon. This phenomenon, discovered in the 1990s, occurs naturally when Deep Learning architectures trained with classic learning paradigms learn incrementally from a stream of experiences. This dissertation revolves around Continual Learning, a sub-field of Machine Learning research that has recently made a comeback following the renewed interest in Deep Learning approaches. The work takes a comprehensive view of continual learning, considering the algorithmic, benchmarking, and applicative aspects of the field. It also touches on community aspects, such as the design and creation of research tools supporting Continual Learning research, and the theoretical and practical aspects of public competitions in the field.
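To make the phenomenon concrete, here is a minimal sketch (illustrative only, not taken from the dissertation; the tasks and network are invented) of catastrophic forgetting: a small PyTorch network trained naively on two synthetic tasks in sequence typically loses most of its accuracy on the first task after fitting the second.

```python
# Minimal catastrophic-forgetting demo on two synthetic binary tasks.
# After sequential training with no replay or regularization, accuracy
# on task A typically collapses towards chance once task B is learned.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Label depends on whether x1 + x2 exceeds a task-specific threshold.
    X = torch.randn(2000, 2) + shift
    y = ((X[:, 0] + X[:, 1]) > 2 * shift).long()
    return X, y

def accuracy(model, X, y):
    with torch.no_grad():
        return (model(X).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

task_a, task_b = make_task(0.0), make_task(4.0)
for name, (X, y) in [("A", task_a), ("B", task_b)]:
    for _ in range(200):  # naive sequential training on one task at a time
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    print(f"after task {name}: acc A={accuracy(model, *task_a):.2f}, "
          f"acc B={accuracy(model, *task_b):.2f}")
```

Continual Learning methods (replay, regularization, architectural strategies) aim precisely at preventing this collapse.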
Abstract:
Marine biomineralizing organisms provide a fundamental link between biology and environment. Calcified structures are important archives that offer a primary means of understanding organismal adaptation, habits, and environmental characteristics, and of looking back in time to explore past climate and evolutionary history. Indeed, biomineralized structures retain an unparalleled record of current and past ocean conditions, accessible through the investigation of their microchemistry and isotopes. This thesis considers aspects of two different biomineralization systems, fish otoliths and coral skeletons, at the macro-, micro-, and nanoscale, with the aim of understanding how their morphology, structural characteristics, and composition can provide information about their functionality and about the environmental, behavioural, and evolutionary context in which these organisms live. To this end, I applied a multidisciplinary approach to investigate calcified structures as "information recorders" and as models for studying phenotypic plasticity.
Abstract:
Over the last century, mathematical optimization has become a prominent tool for decision making. Its systematic application in practical fields such as economics, logistics, or defense has led to the development of algorithmic methods of ever-increasing efficiency. Indeed, for a variety of real-world problems, finding an optimal decision among a set of (implicitly or explicitly) predefined alternatives has become conceivable in reasonable time. In recent decades, however, the research community has paid increasing attention to the role of uncertainty in the optimization process. In particular, one may question the notions of optimality, and even feasibility, when studying decision problems with unknown or imprecise input parameters. This concern is all the more critical in a world becoming more and more complex, by which we mean interconnected, where each individual variation inside a system inevitably causes other variations in the system itself. In this dissertation, we study a class of optimization problems that suffer from imprecise input data and feature a two-stage decision process, i.e., where decisions are made in a sequential order, called stages, and where unknown parameters are revealed throughout the stages. Applications of such problems abound in practice: for example, facility location with uncertain demands, transportation with uncertain costs, or scheduling under uncertain processing times. The uncertainty is dealt with from a robust optimization (RO) viewpoint (also known as the "worst-case perspective"), and we present original contributions to the RO literature on both the theoretical and the practical side.
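For illustration, a generic two-stage robust formulation (a standard textbook form; the exact models studied in the dissertation may differ) has first-stage decisions x, an uncertainty set Ξ, and second-stage recourse decisions y chosen after the uncertain parameters ξ are revealed:

```latex
\min_{x \in X} \left\{ c^{\top} x \;+\; \max_{\xi \in \Xi} \; \min_{y \in Y(x,\,\xi)} \; q(\xi)^{\top} y \right\}
```

The inner max-min captures the worst-case cost of the best recourse: nature picks the least favorable ξ ∈ Ξ, and the second-stage decision y then adapts to it.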
Abstract:
Water Distribution Networks (WDNs) play a vital role in communities, ensuring well-being and supporting economic growth and productivity. The need for greater investment calls for design choices that will impact the efficiency of management in the coming decades. This thesis proposes an algorithmic approach to two related problems: (i) identifying the fundamental asset of large WDNs in terms of their main infrastructure; (ii) sectorizing large WDNs into isolated sectors while respecting the minimum service to be guaranteed to users. Two methodologies were developed to meet these objectives and were subsequently integrated into an overall process that optimizes the sectorized configuration of a WDN while addressing problems (i) and (ii) within a single global vision. Regarding problem (i), the methodology introduces the concept of a primary network and takes a dual approach: connecting the main nodes of the WDN in terms of hydraulic infrastructure (reservoirs, tanks, pumping stations) and identifying hypothetical paths with minimal energy losses. The primary network thus identified can be used as an initial basis for designing the sectors. The sectorization problem (ii) was tackled with optimization techniques through the development of a new dedicated Tabu Search algorithm able to handle real WDN case studies. To this end, three new large WDN models were developed to test the capabilities of the algorithm on different, complex real cases. The methodology also automatically identifies deficient parts of the primary network and dynamically adds new edges to support a sectorized configuration of the WDN. Applying the overall algorithm to the new real case studies and to others from the literature yielded applicable solutions even in particularly complex situations.
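As a rough illustration of the metaheuristic involved (a generic skeleton; the dissertation's dedicated algorithm, neighborhood moves, and hydraulic objective function are not reproduced here), a minimal Tabu Search in Python could look like this:

```python
# Generic Tabu Search skeleton: greedy local moves with a tabu list
# that forbids recently applied moves, plus an aspiration criterion.
def tabu_search(initial, neighbors, cost, tabu_tenure=10, max_iters=500):
    """Minimize `cost` over solutions reachable via `neighbors(solution)`.

    `neighbors(s)` yields (move, candidate) pairs; moves must be hashable.
    A move stays tabu for `tabu_tenure` iterations unless it improves on
    the best cost found so far (aspiration).
    """
    current = best = initial
    best_cost = cost(best)
    tabu = {}  # move -> last iteration at which it is still forbidden
    for it in range(max_iters):
        candidates = [(m, s) for m, s in neighbors(current)
                      if tabu.get(m, -1) < it or cost(s) < best_cost]
        if not candidates:
            break
        move, current = min(candidates, key=lambda ms: cost(ms[1]))
        tabu[move] = it + tabu_tenure
        if cost(current) < best_cost:
            best, best_cost = current, cost(current)
    return best, best_cost
```

A toy use, minimizing (x - 7)^2 over the integers with unit moves:

```python
nbrs = lambda x: [("+1", x + 1), ("-1", x - 1)]
print(tabu_search(0, nbrs, lambda x: (x - 7) ** 2))  # -> (7, 0)
```

In the WDN setting, a solution would encode a sector assignment, moves would reassign pipes or nodes between sectors, and the cost would combine hydraulic performance with service constraints.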
Abstract:
Riding the wave of recent groundbreaking achievements, artificial intelligence (AI) is currently the buzzword on everyone's lips, and Machine Learning (ML), which allows algorithms to learn from historical data, has emerged as its pinnacle. The multitude of algorithms, each with unique strengths and weaknesses, highlights the absence of a universal solution and poses a challenging optimization problem. In response, automated machine learning (AutoML) navigates vast search spaces under tight time constraints. By lowering entry barriers, AutoML holds the promise of democratizing AI, yet it still faces several challenges. In data-centric AI, the discipline of systematically engineering the data used to build an AI system, a central challenge is the configuration of data pipelines. We devise a methodology for building effective data pre-processing pipelines in supervised learning, as well as a data-centric AutoML solution for unsupervised learning. In human-centric AI, many current AutoML tools were built not around the user but around algorithmic ideas, raising concerns about ethics and social bias. We contribute by deploying AutoML tools that aim to complement, rather than replace, human intelligence. In particular, we provide solutions for single-objective and multi-objective optimization, and we showcase the challenges and potential of novel interfaces featuring large language models. Finally, some application areas rely on numerical simulators, often related to Earth observation; they tend to be particularly high-impact and address important challenges such as climate change and crop life cycles. We commit to coupling these physical simulators with (Auto)ML solutions towards a physics-aware AI. Specifically, in precision farming, we design a smart irrigation platform that allows real-time monitoring of soil moisture, predicts future moisture values, and estimates water demand to schedule irrigation.
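To give a flavor of what searching over data pre-processing pipelines means in practice, here is a toy sketch (not the thesis's AutoML system; it assumes scikit-learn and uses an invented, tiny configuration space) that randomly samples scaler and feature-selector choices and keeps the best cross-validated pipeline:

```python
# Random search over simple pre-processing pipeline configurations.
import random
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.feature_selection import SelectKBest
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
scalers = [("none", "passthrough"), ("std", StandardScaler()),
           ("minmax", MinMaxScaler())]
selectors = [("none", "passthrough")] + [(f"k{k}", SelectKBest(k=k))
                                         for k in (5, 10, 15)]

random.seed(0)
best = (None, -1.0)
for _ in range(10):  # sample 10 random pipeline configurations
    (sn, scaler), (fn, selector) = random.choice(scalers), random.choice(selectors)
    pipe = Pipeline([("scale", scaler), ("select", selector),
                     ("clf", LogisticRegression(max_iter=1000))])
    score = cross_val_score(pipe, X, y, cv=5).mean()
    if score > best[1]:
        best = (f"scale={sn}, select={fn}", score)
print("best config:", best)
```

Real AutoML systems replace the random sampling with smarter strategies (e.g., Bayesian optimization) and search far richer spaces, but the structure, configure, evaluate, keep the best, is the same.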
Abstract:
In the economics and game-theory literature there is an open debate on whether anticompetitive behaviour can emerge from algorithms that automatically set market prices. The goal of this thesis is to develop an actor-critic reinforcement learning model with entropy regularization to set prices in a dynamic game of oligopolistic competition with continuous prices. The proposed model consistently exhibits cooperative behaviour supported by punishment mechanisms that discourage deviation from the equilibrium reached at convergence. The model's behaviour during learning and after convergence also helps interpret the actions taken by tabular Q-learning and other pricing algorithms under similar conditions. The results are robust to variation in the number of competing agents and in the type of deviation from the equilibrium obtained at convergence, with even deviations to higher prices being punished.
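For context, the standard entropy-regularized objective that such an actor-critic agent maximizes (a textbook formulation; the thesis's exact reward and parameterization are assumptions here, with r read as the firm's per-period profit) augments the expected discounted reward with a policy-entropy bonus weighted by a temperature α:

```latex
J(\pi) \;=\; \mathbb{E}_{\pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t}\Big(r(s_t, a_t) \;+\; \alpha\,\mathcal{H}\big(\pi(\cdot \mid s_t)\big)\Big)\right],
\qquad
\mathcal{H}\big(\pi(\cdot \mid s)\big) \;=\; -\int_{\mathcal{A}} \pi(a \mid s)\,\log \pi(a \mid s)\, da
```

The entropy term keeps the pricing policy stochastic during learning, encouraging exploration of the continuous price space before the agents settle on an equilibrium.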
Abstract:
When it comes to designing a structure, architects and engineers want to join forces to create and build the most beautiful and efficient building. From finding new shapes and forms to optimizing stability and resistance, the two professions are in constant dialogue. In architecture, there has always been a particular interest in creating new shapes and types of structure inspired by many different fields, one of them being nature itself. In engineering, the selection of the optimum has always dictated the way of thinking about and designing structures, a mindset that over time led to the current best practices in construction. At a certain point, however, both disciplines were limited by traditional manufacturing constraints. Over the last decades, much technological progress has been made, allowing designers to go beyond today's manufacturing constraints. With the emergence of Wire-and-Arc Additive Manufacturing (WAAM) combined with Algorithmic-Aided Design (AAD), architects and engineers are offered new opportunities to merge architectural beauty and structural efficiency. Both technologies allow unusual and complex structural shapes to be explored and built while also reducing costs and environmental impact. In this study, the author uses the aforementioned technologies and assesses their potential, first to design an aesthetically pleasing tree-like column, and second to propose a new type of standardized, optimized sandwich cross-section to the construction industry. Parametric algorithms to model the dendriform column and the new sandwich cross-section are developed and presented in detail. A draft catalog of the latter, together with methods to establish it, is then proposed and discussed. Finally, its buckling behavior is assessed considering both standard steel and WAAM material properties.
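For reference, the elastic buckling assessment mentioned above typically starts from Euler's critical load for a compressed column (a standard formula, not specific to the dissertation's cross-sections):

```latex
P_{cr} \;=\; \frac{\pi^{2} E I}{(K L)^{2}}
```

where E is Young's modulus, I the second moment of area of the cross-section, L the column length, and K the effective-length factor set by the boundary conditions. Sandwich cross-sections aim to raise I, and hence the buckling resistance, without a proportional increase in material.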