900 results for Stochastic agent-based models
Abstract:
Stochastic Optimization Methodology for Wind-Photovoltaic Coordination. This dissertation focuses on the coordination of wind and photovoltaic systems participating in the electricity market. The uncertainty of wind and photovoltaic power is a defining characteristic of these systems and must be considered in the optimal scheduling of their coordinated operation. To model this uncertainty, a stochastic optimization methodology based on linear programming is presented, maximizing the expected profit of a power producer participating in the day-ahead market. Coordinating wind with photovoltaic systems aims to mitigate the energy deviations that result from the hourly offers submitted in the day-ahead market and, consequently, to reduce the associated financial penalties. The results obtained for the coordination of a wind system with a photovoltaic system are compared with those obtained for uncoordinated operation, showing that the proposed methodology yields an expected profit higher than that of uncoordinated operation.
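A minimal sketch of the kind of scenario-based linear program this abstract describes, reduced to a single trading hour: the bid maximizes expected profit under a two-sided imbalance penalty, with absolute deviations linearized through auxiliary variables. All prices, probabilities, capacities, and scenario outputs below are illustrative placeholders, not values from the dissertation.

```python
# Scenario-based LP for a single day-ahead trading hour (illustrative values).
import numpy as np
from scipy.optimize import linprog

price = 50.0           # day-ahead price (EUR/MWh), assumed
penalty = 80.0         # imbalance penalty (EUR/MWh), assumed
prob = np.array([0.3, 0.4, 0.3])          # scenario probabilities
joint = np.array([80.0, 110.0, 140.0])    # wind+PV output per scenario (MWh)
S = len(prob)

# Decision vector x = [bid, d_1, ..., d_S], with d_s >= |output_s - bid|.
c = np.concatenate(([-price], penalty * prob))   # minimize -expected profit
A_ub, b_ub = [], []
for s in range(S):
    row = np.zeros(S + 1); row[0] = 1.0; row[1 + s] = -1.0
    A_ub.append(row); b_ub.append(joint[s])      # bid - d_s <= output_s
    row = np.zeros(S + 1); row[0] = -1.0; row[1 + s] = -1.0
    A_ub.append(row); b_ub.append(-joint[s])     # -bid - d_s <= -output_s
bounds = [(0, 150.0)] + [(0, None)] * S          # bid limited by capacity

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds)
print(f"optimal bid: {res.x[0]:.1f} MWh, expected profit: {-res.fun:.1f} EUR")
```

Because the solution balances the price against the expected penalty, the optimal bid lands on a quantile of the scenario distribution rather than on its mean.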
Abstract:
Intelligent agents offer a new and exciting way of understanding the world of work. In this paper we apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Although we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for fostering sustainable organizational capabilities in the future. The project is still at an early stage. So far we have conducted a case study in a UK department store to collect data and capture impressions about the operations and actors within departments. Based on this case study we have built and tested the first version of a retail branch simulator, which we present in this paper.
Abstract:
Intelligent agents offer a new and exciting way of understanding the world of work. In this paper we apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Although we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for fostering sustainable organizational capabilities in the future. Our research so far has led us to conduct case-study work with a top-ten UK retailer, collecting data in four departments in two stores. Based on our case-study data we have built and tested a first version of a department store simulator. In this paper we report on the current development of our simulator, which includes new features concerning more realistic data on the pattern of footfall during the day and the week, a more differentiated view of customers, and the evolution of customers over time. This allows us to investigate more complex scenarios and to analyze the impact of various management practices.
Abstract:
Computational intelligence support for decision making is becoming increasingly popular and essential among medical professionals. Moreover, as modern medical devices can communicate with ICT systems, the resulting models can readily be translated into software. Machine learning solutions for medicine range from the robust but opaque paradigms of support vector machines and neural networks to the equally performant, yet more comprehensible, decision trees and rule-based models. How, then, can such different techniques be combined so that the professional obtains the whole spectrum of their particular advantages? The approaches presented here were conceived for various medical problems, while constantly keeping in mind the balance between good accuracy and an understandable interpretation of the decision, in order to truly establish a trustworthy ‘artificial’ second opinion for the medical expert.
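As one hedged illustration of pairing an opaque, high-accuracy learner with a comprehensible one, the sketch below combines an SVM and a shallow decision tree in a soft-voting ensemble, then prints the tree's rules as the human-readable half of the "second opinion". The dataset and hyperparameters are placeholders, not those of the approaches presented in the work.

```python
# Opaque (SVM) + interpretable (tree) learners combined by soft voting.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
svm = make_pipeline(StandardScaler(), SVC(probability=True))
combo = VotingClassifier([("tree", tree), ("svm", svm)], voting="soft")
combo.fit(X_tr, y_tr)

print("ensemble accuracy:", combo.score(X_te, y_te))
# The tree half of the ensemble doubles as a readable explanation:
print(export_text(combo.named_estimators_["tree"], max_depth=2))
```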
Abstract:
Multi-agent systems offer a new and exciting way of understanding the world of work. We apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between people management practices on the shop floor and retail performance. Although we are working within a relatively novel and complex domain, it is clear that an agent-based approach offers great potential for improving organizational capabilities in the future. Our multi-disciplinary research team has worked closely with one of the UK’s top ten retailers to collect data and build an understanding of shop-floor operations and the key actors in a department (customers, staff, and managers). Based on this case study we have built and tested the first version of a retail branch agent-based simulation model, focusing on how we can simulate the effects of people management practices on customer satisfaction and sales. In our experiments we have looked at employee development and cashier empowerment as two examples of shop-floor management practices. In this paper we describe the underlying conceptual ideas and the features of our simulation model. We present a selection of experiments conducted to validate the simulation model and to show its potential for answering “what-if” questions in a retail context. We also introduce a novel performance measure, created to quantify customers’ satisfaction with service based on their individual shopping experiences.
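The authors' simulator is not reproduced here, so the following is only a toy of the underlying idea: customers arrive according to a footfall probability, queue for a single cashier, and each shopping experience is scored individually, with satisfaction decaying in waiting time. All rates, tolerances, and the satisfaction formula are assumptions for illustration.

```python
# Toy shop-floor loop: arrivals, one cashier, per-customer satisfaction.
import random

random.seed(1)
WAIT_TOLERANCE = 5          # ticks a customer will wait happily (assumed)
service_left, queue, scores = 0, [], []

for tick in range(480):                 # one simulated trading day
    if random.random() < 0.3:           # footfall: arrival probability per tick
        queue.append(tick)              # remember each customer's arrival time
    if service_left == 0 and queue:
        arrived = queue.pop(0)
        wait = tick - arrived
        # Per-customer satisfaction: 1 if served promptly, decaying with wait.
        scores.append(max(0.0, 1.0 - wait / (4 * WAIT_TOLERANCE)))
        service_left = random.randint(2, 6)   # transaction length in ticks
    service_left = max(0, service_left - 1)

print(f"served {len(scores)} customers, "
      f"mean satisfaction {sum(scores) / len(scores):.2f}")
```

Management practices such as cashier empowerment would enter a model like this as changes to the service-time distribution or the satisfaction function, which is what makes "what-if" experiments possible.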
Abstract:
Decades of costly failures in translating drug candidates from preclinical disease models to human therapeutic use warrant reconsideration of the priority placed on animal models in biomedical research. Following an international workshop attended by experts from academia, government institutions, research funding bodies, and the corporate and nongovernmental organisation (NGO) sectors, in this consensus report, we analyse, as case studies, five disease areas with major unmet needs for new treatments. In view of the scientifically driven transition towards a human pathway-based paradigm in toxicology, a similar paradigm shift appears to be justified in biomedical research. There is a pressing need for an approach that strategically implements advanced, human biology-based models and tools to understand disease pathways at multiple biological scales. We present recommendations to help achieve this.
Abstract:
Unlike traditional commerce, in online commerce the customer cannot touch or try the product. The purchase decision is made on the basis of the data made available by the seller through the title, descriptions, and images, and on the reviews left by previous customers. It is therefore possible to predict how well a product will sell from this information. Most of the solutions currently found in the literature make predictions based on reviews, or analyze the language used in descriptions to understand how it influences sales. Reviews, however, are not available to sellers before a product reaches the market; moreover, using only textual data neglects the influence of images. The goal of this thesis is to use machine learning models to predict the sales success of a product from the information available to the seller before commercialization. This is done by introducing a cross-modal classification model based on a Vision-Language Transformer. A model of this kind can help sellers maximize the sales success of their products. Because the literature lacks datasets that describe products sold online together with an indication of their sales success, the work includes the construction of a dataset suitable for testing the developed solution. The dataset lists 78,300 fashion products sold on Amazon, reporting for each one the main information made available by the seller and a measure of market success, derived from the ratings expressed by buyers and from the product's position in a ranking based on the number of units sold.
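As a hedged skeleton of what such a cross-modal classifier can look like (not the thesis architecture itself), the snippet below fuses image-patch and text-token embeddings with a transformer encoder and classifies from a learnable CLS token. Every dimension, the number of layers, and the random stand-in embeddings are assumptions.

```python
# Cross-modal (vision + language) classification skeleton in PyTorch.
import torch
import torch.nn as nn

class CrossModalClassifier(nn.Module):
    def __init__(self, dim=256, heads=4, layers=2, n_classes=2):
        super().__init__()
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, img_tokens, txt_tokens):
        # Concatenate both modalities behind a learnable classification token.
        x = torch.cat([self.cls.expand(img_tokens.size(0), -1, -1),
                       img_tokens, txt_tokens], dim=1)
        return self.head(self.encoder(x)[:, 0])   # logits from the CLS token

model = CrossModalClassifier()
img = torch.randn(8, 49, 256)    # stand-in patch embeddings (e.g., 7x7 grid)
txt = torch.randn(8, 32, 256)    # stand-in title/description embeddings
print(model(img, txt).shape)     # torch.Size([8, 2])
```

In a real pipeline the stand-in tensors would come from pretrained image and text encoders, and the two logits would correspond to the success/no-success labels derived from the ranking described above.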
Abstract:
In the last few years, mobile wireless technology has gone through a revolutionary change. Web-enabled devices have evolved into essential tools for communication, information, and entertainment. The fifth generation (5G) of mobile communication networks is envisioned as a key enabler of the coming wireless revolution. Millimeter-wave (mmWave) spectrum and the evolution of Cloud Radio Access Networks (C-RANs) are two of the main technological innovations of 5G wireless systems and beyond. Because of the current spectrum shortage, mmWaves have been proposed for next-generation systems, providing larger bandwidths and higher data rates; consequently, new radio channel models are being developed. Recently, deterministic ray-based models such as Ray-Tracing (RT) have become more attractive thanks to their frequency agility and reliable predictions. A modern RT software tool has been calibrated and used to analyze the mmWave channel; for this, knowledge of the electromagnetic properties of materials is essential. Hence, an item-level electromagnetic characterization of common construction materials has been carried out to obtain their complex relative permittivity. A complete tuning of the RT tool has been performed against indoor and outdoor measurement campaigns at 27 and 38 GHz, setting the basis for the future development of advanced beamforming techniques that rely on deterministic propagation models such as RT. C-RAN is a novel mobile network architecture that can address a number of challenges network operators face in meeting continuously growing customer demands. C-RANs have already been adopted in advanced 4G deployments; however, some issues remain, especially considering the bandwidth requirements set by the forthcoming 5G systems. Open RAN specifications have been proposed to overcome the new 5G challenges placed on C-RAN architectures, including synchronization aspects. This work also describes an FPGA implementation of the Synchronization Plane for an O-RAN-compliant radio system.
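As a small numeric aside under a free-space assumption only, the Friis formula below gives the path loss at the two measured bands, which is part of why mmWave channels need the careful material characterization and RT calibration described above.

```python
# Free-space path loss at the two measured mmWave bands (free space only).
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

for f_ghz in (27, 38):
    print(f"{f_ghz} GHz, 100 m: {fspl_db(f_ghz * 1e9, 100):.1f} dB")
```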
Abstract:
With the entry into force of the latest Italian Building Code (NTC 2008, 2018), innovative criteria were introduced, especially concerning the seismic verification of large infrastructures. In particular, for strategic buildings such as large dams, a seismotectonic study of the site was declared necessary, which involves a re-assessment of the basic seismic hazard. This PhD project fits into this context, as part of the seismic re-evaluation of large dams launched on a national scale following O.P.C.M. 3274/2003 and D.L. 79/2004. A full seismotectonic study of the region around two large earth dams in Southern Italy was carried out. We identified and characterized the structures that could generate earthquakes in the study area and defined the local seismic history. This information was used to reassess the basic seismic hazard using probabilistic seismic hazard assessment approaches. In recent years, fault-based models for seismic hazard assessment have been proposed worldwide as an emerging methodology; for this reason, we decided to test the innovative SHERIFS approach on our study area. The seismotectonic study also provided the opportunity to examine the characteristics of the seismic stations that supplied its data: we focused on the 10 stations that had been active the longest and carried out a geophysical characterization, whose data fed into a more general study of soil-structure interaction at seismic stations and of the ways it can affect the SHA. Lastly, an additional experimental study on the two dams and their associated minor structures is presented, aimed at defining their main dynamic parameters, which are useful for subsequent dynamic structural and geotechnical studies.
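For readers unfamiliar with the probabilistic ingredient, the toy computation below combines a Gutenberg-Richter recurrence law with the Poisson exceedance probability commonly used in probabilistic seismic hazard assessment. The a and b values are placeholders, not results of the study.

```python
# Gutenberg-Richter rates and Poisson exceedance probabilities (toy values).
import math

a, b = 4.0, 1.0                      # assumed Gutenberg-Richter parameters
for mag in (5.0, 6.0, 7.0):
    rate = 10 ** (a - b * mag)       # annual rate of events with M >= mag
    p50 = 1 - math.exp(-rate * 50)   # Poisson exceedance over 50 years
    print(f"M>={mag}: {rate:.3f}/yr, P(50 yr) = {p50:.2%}")
```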
Abstract:
Earthquake prediction is a complex task for scientists, owing to the rare occurrence of high-intensity earthquakes and their inaccessible depths. Despite this challenge, protecting infrastructure and populations living in areas of high seismic risk is a priority, and reliable forecasting requires comprehensive knowledge of seismic phenomena. This thesis presents the development, application, and comparison of both deterministic and probabilistic forecasting methods. On the deterministic side, it describes the implementation of an alarm-based method that uses the occurrence of strong (fore)shocks, widely felt by the population, as a precursor signal. This model is applied to the retrospective prediction of Italian earthquakes of magnitude M ≥ 5.0, 5.5, and 6.0 that occurred from 1960 to 2020, with performance evaluated using tests and statistics specific to deterministic alarm-based models. On the probabilistic side, the thesis focuses mainly on the EEPAS and ETAS models. Although the EEPAS model has previously been applied and tested in some regions of the world, it had never been used to forecast Italian earthquakes; here it is used to retrospectively forecast Italian shallow earthquakes of magnitude M ≥ 5.0 using new MATLAB software. The forecasting performance of the probabilistic models was compared to other models using CSEP binary tests. The EEPAS and ETAS models showed different characteristics for forecasting Italian earthquakes, with EEPAS performing better in the long term and ETAS in the short term. The FORE model, based on strong precursor quakes, is compared to EEPAS and ETAS using an alarm-based deterministic approach. All models perform better than a random forecast, with the ETAS and FORE models showing the best performance; to fully evaluate forecasting performance, however, prospective tests should be conducted. The lack of objective tests for evaluating deterministic models and comparing them with probabilistic ones was a challenge faced during the study.
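As a compact sketch of the ETAS conditional intensity in its usual form (a background rate plus Omori-Utsu aftershock triggering scaled exponentially by magnitude), not the fitted Italian model of the thesis, with illustrative parameter values:

```python
# ETAS conditional intensity: lambda(t) = mu + sum_i K*exp(alpha*(m_i - m0))
#                                         * (t - t_i + c)^(-p), for t_i < t.
import numpy as np

mu, K, alpha, c, p, m0 = 0.2, 0.05, 1.0, 0.01, 1.1, 5.0   # toy parameters

def etas_rate(t, event_times, event_mags):
    """Seismicity rate at time t given the past catalog (times in days)."""
    past = event_times < t
    trig = K * np.exp(alpha * (event_mags[past] - m0)) \
             * (t - event_times[past] + c) ** (-p)
    return mu + trig.sum()

times = np.array([0.0, 1.5, 2.0])    # toy catalog: days since start
mags = np.array([5.2, 5.8, 5.1])
print(f"rate one day after the last shock: {etas_rate(3.0, times, mags):.3f}/day")
```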
Abstract:
In this thesis, the viability of Dynamic Mode Decomposition (DMD) as a technique to analyze and model complex dynamic real-world systems is presented. This method derives, directly from data, computationally efficient reduced-order models (ROMs) that can replace high-fidelity physics-based models which are too onerous to run or simply unavailable. Optimizations and extensions to the standard implementation of the methodology are proposed and investigated on diverse case studies related to the decoding of complex flow phenomena. The flexibility of this data-driven technique allows its application to high-fidelity fluid dynamics simulations as well as to time series of observations of real systems. The resulting ROMs are tested on two tasks: (i) reducing the storage requirements of high-fidelity simulations or observations; (ii) interpolating and extrapolating missing data. The capabilities of DMD can also be exploited to alleviate the cost of onerous studies that require many simulations, such as uncertainty quantification analysis, especially when dealing with complex high-dimensional systems. In this context, a novel approach is proposed to address parameter variability when modeling systems with a space- and time-variant response: DMD is merged with another model-reduction technique, the Polynomial Chaos Expansion, for uncertainty quantification purposes. The study yields useful guidelines for DMD deployment, together with a demonstration of its potential to ease diagnosis and scenario analysis when complex flow processes are involved.
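The standard (exact) DMD algorithm is short enough to sketch in full: shifted snapshot pairs, a rank-r SVD truncation, the reduced linear operator, and its eigendecomposition. The synthetic two-structure dataset below is purely for demonstration; the rank r and the data are assumptions.

```python
# Exact DMD on a synthetic snapshot matrix (rank-r reduced-order model).
import numpy as np

def dmd(X, r):
    """Return DMD eigenvalues and modes from a snapshot matrix X (n x m)."""
    X1, X2 = X[:, :-1], X[:, 1:]              # shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh[:r].conj().T
    A_tilde = U.conj().T @ X2 @ V / s         # reduced linear operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ V / s @ W                    # exact DMD modes
    return eigvals, modes

# Synthetic data: two oscillating spatial structures.
t = np.linspace(0, 4 * np.pi, 100)
x = np.linspace(-5, 5, 64)[:, None]
X = np.sin(x) * np.cos(2 * t) + np.cosh(x / 3) ** -1 * np.sin(t)

eigvals, modes = dmd(X, r=4)
order = np.argsort(-np.abs(eigvals))
print("leading DMD eigenvalues:", np.round(eigvals[order][:2], 3))
```

Storing only the r modes and eigenvalues, instead of every snapshot, is what delivers the storage reduction and the interpolation/extrapolation ability mentioned in the abstract.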
Abstract:
The growing number of attacks against computer systems and services calls for new cybersecurity strategies. This thesis considers one of the most modern approaches to this task, based on Zero Trust architectures, which de-perimeterize systems and aim to verify every attempt to access resources regardless of whether the request originates locally or remotely. In this context, the thesis proposes a new form of agent-based microsegmentation built on overlay networks, with the goal of improving the scalability and robustness of existing solutions, qualities that are currently set aside in favor of ease of configuration. An extensive series of tests shows that the described approach, applicable to many types of cloud systems, can guarantee not only security but also scalability as the number of participating nodes grows, robustness through the absence of single points of failure, and simplicity of configuration.
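As a toy of the Zero Trust principle described above (not the thesis' overlay-network implementation), the sketch below shows a deny-by-default check that authorizes each request by identity and policy rather than by network position; the policies and identifiers are placeholders.

```python
# Deny-by-default authorization: identity + policy decide, never the network.
POLICY = {("web-frontend", "orders-db"): {"read"},
          ("batch-worker", "orders-db"): {"read", "write"}}

def authorize(source_id: str, resource: str, action: str) -> bool:
    """Allow only identity/resource pairs explicitly present in the policy."""
    return action in POLICY.get((source_id, resource), set())

# Local and remote requests are treated identically: there is no implicit
# trust for traffic originating "inside" a perimeter.
print(authorize("web-frontend", "orders-db", "read"))    # True
print(authorize("web-frontend", "orders-db", "write"))   # False
```

In an agent-based microsegmentation scheme, a check of this kind would run in a per-node agent on the overlay network, so every resource enforces the policy independently of any central chokepoint.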
Abstract:
The aim of this research is to develop a design method that integrates the contributions of the different disciplines of architecture, engineering, and fabrication within a project, using as a case study a tectonic of planar timber elements for building shell surfaces to be used as temporary pavilions. This goal is pursued through an agent-based system that mediates between the various objectives under consideration, in this case between aesthetic parameters tied to the chosen geometry and fabrication parameters. This system is applied to the study of a shell structure, which thanks to its natural stiffness integrates form and structural capacity, through a planar tessellation of the surface itself. The system is based on the circle relaxation algorithm, extended with behaviors that account for the curvature of the surface and other behaviors chosen specifically to facilitate the tessellation process via tangent plane intersection. The choice of planar elements is aimed at easier fabrication and assembly, envisaging CNC machines for fabrication and an entirely dry assembly that requires no scaffolding. The proposed result is a pavilion made of reconfigurable planar timber elements, with particular attention to the ease and speed of their assembly, useful for temporary and/or emergency structures.
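A two-dimensional toy of the circle-relaxation behavior the agent system builds on is sketched below: circles repel until they are roughly tangent. The surface-curvature and tangent-plane-intersection behaviors of the thesis are omitted; counts, radii, and step sizes are assumptions.

```python
# 2D circle relaxation: overlapping circles push apart along center lines.
import random

random.seed(0)
N, R, STEPS = 20, 1.0, 200
pts = [[random.uniform(0, 5), random.uniform(0, 5)] for _ in range(N)]

for _ in range(STEPS):
    for i in range(N):
        fx = fy = 0.0
        for j in range(N):
            if i == j:
                continue
            dx, dy = pts[i][0] - pts[j][0], pts[i][1] - pts[j][1]
            d = (dx * dx + dy * dy) ** 0.5 or 1e-9
            overlap = 2 * R - d              # > 0 when circles intersect
            if overlap > 0:
                fx += overlap * dx / d       # push apart along the center line
                fy += overlap * dy / d
        pts[i][0] += 0.1 * fx                # damped relaxation step
        pts[i][1] += 0.1 * fy

mind = min(((pts[i][0] - pts[j][0]) ** 2 + (pts[i][1] - pts[j][1]) ** 2) ** 0.5
           for i in range(N) for j in range(i + 1, N))
print(f"smallest center distance after relaxation: {mind:.2f} (target {2 * R})")
```

Once the circles settle into near-tangency, each center defines a tangent plane on the target surface, and intersecting neighboring planes yields the planar panels to be CNC-cut.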
Abstract:
In this article we use factor models to describe a certain class of covariance structure for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to be time-varying and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions of the models presented here in the prediction area. This discussion leads to open issues on real-time implementation and natural model comparisons.
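As a hedged simulation sketch of this model class, the snippet below generates a one-factor model for several return series where the factor's log-variance follows an AR(1) stochastic volatility process; loadings are kept time-invariant for brevity, and all parameter values are illustrative.

```python
# One-factor model with stochastic volatility on the common factor.
import numpy as np

rng = np.random.default_rng(0)
T, k = 500, 3                          # time points, number of series
beta = np.array([1.0, 0.8, 0.5])       # factor loadings (constant here)
phi, sigma_eta = 0.95, 0.2             # SV persistence and innovation scale

h = np.zeros(T)                        # log-variance of the common factor
for t in range(1, T):
    h[t] = phi * h[t - 1] + sigma_eta * rng.normal()
f = np.exp(h / 2) * rng.normal(size=T)                  # factor with SV
y = np.outer(f, beta) + 0.1 * rng.normal(size=(T, k))   # observed returns

print("sample covariance of the three series:\n", np.cov(y.T).round(3))
```

The implied covariance at time t is exp(h_t) * beta beta' plus idiosyncratic noise; making beta itself time-varying, as the article proposes, lets the model track shifting asset weights as well as shifting volatility.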