22 results for data storage concept
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Early definitions of Smart Building focused almost entirely on the technology aspect and did not consider user interaction at all; indeed, today we would attribute them more to the concept of the automated building. In this sense, control of comfort conditions inside buildings is a well-investigated problem, since it has a direct effect on users' productivity and an indirect effect on energy saving. From the users' perspective, a typical environment can be considered comfortable if it is capable of providing adequate thermal comfort, visual comfort, indoor air quality and acoustic comfort. In recent years, the scientific community has dealt with many challenges, especially from a technological point of view. For instance, smart sensing devices, the Internet, and communication technologies have enabled a new paradigm called Edge computing, which brings computation and data storage closer to the location where they are needed, to improve response times and save bandwidth. This has made it possible to improve services, sustainability and decision making. Many solutions have been implemented, such as smart classrooms, control of the thermal conditions of the building, monitoring of HVAC data for the energy efficiency of the campus, and so forth. Though these projects contribute to the realization of a smart campus, a framework for the smart campus is yet to be defined. These new technologies have also introduced new research challenges: within this thesis work, some of the principal open challenges are faced, proposing a new conceptual framework, technologies and tools to move forward the actual implementation of smart campuses. With this in mind, several problems known in the literature have been investigated: occupancy detection, noise monitoring for acoustic comfort, context awareness inside the building, indoor wayfinding, strategic sensor deployment for air quality, and book preservation.
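To make the Edge-computing idea above concrete, here is a minimal sketch (all names hypothetical, not taken from the thesis) of an edge node that keeps raw comfort-sensor samples in local storage and forwards only compact summaries upstream, which is what improves response times and saves bandwidth:

```python
from statistics import mean

# Hypothetical edge-node sketch: raw CO2/noise samples are aggregated
# locally, and only a compact summary leaves the node, saving bandwidth.
class EdgeNode:
    def __init__(self, window_size=60):
        self.window_size = window_size   # samples kept locally
        self.samples = []                # local (edge) data storage

    def ingest(self, value):
        """Store a raw sensor reading at the edge; flush when the window fills."""
        self.samples.append(value)
        if len(self.samples) >= self.window_size:
            return self.flush()          # only summaries leave the node
        return None

    def flush(self):
        summary = {"mean": mean(self.samples),
                   "max": max(self.samples),
                   "n": len(self.samples)}
        self.samples.clear()
        return summary                   # this is all the cloud ever sees

node = EdgeNode(window_size=3)
for reading in (412.0, 430.5, 455.2):    # e.g. CO2 ppm samples
    out = node.ingest(reading)
print(out)  # {'mean': 432.56..., 'max': 455.2, 'n': 3}
```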
Abstract:
In recent years [1], Angiolini and co-workers have synthesized and investigated methacrylic polymers bearing in the side chain the chiral cyclic (S)-3-hydroxypyrrolidine moiety, interposed between the main chain and the trans-azoaromatic chromophore, substituted or not in the 4' position by an electron-withdrawing group. In these materials, the presence of a rigid chiral moiety of one prevailing absolute configuration favours the establishment of a chiral conformation of one prevailing helical handedness, at least within chain segments of the macromolecules, which can be observed by circular dichroism (CD). The simultaneous presence of the azoaromatic and chiral functionalities allows the polymers to display both the properties typical of dissymmetric systems (optical activity, exciton splitting of dichroic absorptions) and the features typical of photochromic materials (photorefractivity, photoresponsiveness, NLO properties). The first part of this research was to synthesize analogous homopolymers and copolymers based on a bisazoaromatic moiety and compare their properties to those of the above-mentioned analogous derivatives bearing only one azoaromatic chromophore in the side chain. We also focused attention on the effects induced on the thermal and chiroptical behaviours by the insertion of particular achiral comonomers characterized by different side-chain mobility and increasing steric hindrance (MMA, tert-BMA and TrMA). On the other hand, carbazole-containing polymers [2] have attracted much attention because of their unique features. The use of these materials in advanced micro- and nanotechnologies spreads across many different applications, such as photoconductive and photorefractive polymers, electroluminescent devices, programmable optical interconnections, data storage, chemical photoreceptors, NLO, surface relief gratings, blue-emitting materials and holographic memory. The second part of the work was focused on the synthesis and characterization of polymeric derivatives bearing in the side chain carbazole or phenylcarbazole moieties linked to the (S)-2-hydroxysuccinimide or the (S)-3-hydroxypyrrolidinyl ring as chiral groups, covalently linked to the main chain through ester bonds. The last objective of this research was to design, synthesize, and characterize multifunctional methacrylic homopolymers and copolymers bearing three distinct functional groups (i.e. azoaromatic, carbazole and a chiral group of one single configuration) directly linked in the side chain. These polymeric derivatives could be of potential interest for several advanced application fields, such as optical storage, waveguides, chiroptical switches, chemical photoreceptors, NLO, surface relief gratings, photoconductive materials, etc.
Abstract:
This thesis deals with Context Aware Services, Smart Environments, Context Management and solutions for Device and Service Interoperability. Multi-vendor devices offer an increasing number of services and end-user applications that base their value on the ability to exploit information originating from the surrounding environment by means of an increasing number of embedded sensors, e.g. GPS, compass, RFID readers, cameras and so on. However, such devices are usually not able to exchange information because of the lack of a shared data storage and common information exchange methods. A large number of standards and domain-specific building blocks are available and are heavily used in today's products. However, the use of these solutions based on ready-to-use modules is not without problems. The integration and cooperation of different kinds of modules can be daunting because of growing complexity and dependency. In this scenario it might be interesting to have an infrastructure that makes the coexistence of multi-vendor devices easy, while enabling low-cost development and smooth access to services. This sort of technology glue should reduce both software and hardware integration costs by removing the trouble of interoperability. The result should also lead to faster and simplified design, development and deployment of cross-domain applications. This thesis is mainly focused on SW architectures supporting context-aware service providers, especially on the following subjects:
- user preference-based service adaptation
- context management
- content management
- information interoperability
- multi-vendor device interoperability
- communication and connectivity interoperability
Experimental activities were carried out in several domains, including Cultural Heritage and indoor and personal smart spaces, all of which are considered significant test-beds in Context Aware Computing. The work evolved within European and national projects: on the European side, I carried out my research activity within EPOCH, the FP6 Network of Excellence on "Processing Open Cultural Heritage", and within SOFIA, a project of the ARTEMIS JU on embedded systems. I worked in cooperation with several international establishments, including the University of Kent, VTT (the Technical Research Centre of Finland) and Eurotech. On the national side, I contributed to a one-to-one research contract between ARCES and Telecom Italia. The first part of the thesis is focused on the problem statement and related work, and addresses interoperability issues and related architecture components. The second part is focused on specific architectures and frameworks:
- MobiComp: a context management framework that I used in cultural heritage applications
- CAB: a context, preference and profile based application broker which I designed within the EPOCH Network of Excellence
- M3: a "Semantic Web based" information sharing infrastructure for smart spaces, designed by Nokia within the European project SOFIA
- NoTA: a service- and transport-independent connectivity framework
- OSGi: the well-known Java-based service support framework
The final section is dedicated to the middleware, the tools and the SW agents developed during my Doctorate to support context-aware services in smart environments.
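As an illustration of the shared-data-storage idea the listed frameworks address, here is a minimal Python sketch of a context broker; it is a hypothetical stand-in, not the actual API of MobiComp, CAB or OSGi:

```python
from collections import defaultdict
from typing import Callable

# Hypothetical sketch of a shared context store: heterogeneous producers
# publish typed context items, consumers subscribe by context type, and
# multi-vendor devices are thus decoupled from one another.
class ContextBroker:
    def __init__(self):
        self._store = {}                   # latest value per (type, source)
        self._subs = defaultdict(list)     # context type -> callbacks

    def publish(self, ctx_type: str, source: str, value):
        self._store[(ctx_type, source)] = value
        for cb in self._subs[ctx_type]:    # notify interested consumers
            cb(source, value)

    def subscribe(self, ctx_type: str, callback: Callable):
        self._subs[ctx_type].append(callback)

broker = ContextBroker()
broker.subscribe("location", lambda src, v: print(f"{src} -> {v}"))
broker.publish("location", "gps-phone-42", (44.494, 11.343))  # Bologna
```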
Abstract:
The continuous advancements and enhancements of wireless systems are enabling new compelling scenarios where mobile services can adapt according to the current execution context, represented by the computational resources available at the local device, the current physical location, people in physical proximity, and so forth. Such services, called context-aware services, require the timely delivery of all relevant information describing the current context, and that introduces several unsolved complexities, spanning from low-level context data transmission up to context data storage and replication in the mobile system. In addition, to ensure correct and scalable context provisioning, it is crucial to integrate and interoperate with different wireless technologies (WiFi, Bluetooth, etc.) and modes (infrastructure-based and ad-hoc), and to use decentralized solutions to store and replicate context data on mobile devices. These challenges call for novel middleware solutions, here called Context Data Distribution Infrastructures (CDDIs), capable of delivering relevant context data to mobile devices while hiding all the issues introduced by data distribution in heterogeneous and large-scale mobile settings. This dissertation thoroughly analyzes CDDIs for mobile systems, with the main goal of achieving a holistic approach to the design of this type of middleware solution. We discuss the main functions needed by context data distribution in large mobile systems, and we advocate the precise definition and strict enforcement of quality-based contracts between context consumers and the CDDI to reconfigure the main middleware components at runtime. We present the design and implementation of our proposals, both in simulation-based and in real-world scenarios, along with an extensive evaluation that confirms the technical soundness of the proposed CDDI solutions. Finally, we consider three highly heterogeneous scenarios, namely disaster areas, smart campuses, and smart cities, to better remark the wide technical validity of our analysis and solutions under different network deployments and quality constraints.
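A minimal sketch of what such a quality-based contract could look like (hypothetical names and fields, not the dissertation's actual interfaces): context data carries quality metadata, and the CDDI delivers an item only if it meets the consumer's declared constraints:

```python
import time
from dataclasses import dataclass, field

# Hypothetical sketch of a quality-based contract between a context
# consumer and a CDDI: items carry quality metadata, and delivery happens
# only if the consumer's declared constraints are satisfied.
@dataclass
class ContextData:
    value: object
    timestamp: float = field(default_factory=time.time)
    accuracy: float = 1.0          # producer-declared quality, 0..1

@dataclass
class QualityContract:
    max_age_s: float               # freshness the consumer requires
    min_accuracy: float            # accuracy the consumer requires

    def accepts(self, item: ContextData) -> bool:
        fresh = (time.time() - item.timestamp) <= self.max_age_s
        return fresh and item.accuracy >= self.min_accuracy

contract = QualityContract(max_age_s=5.0, min_accuracy=0.8)
sample = ContextData(value="room-2.12", accuracy=0.9)
print(contract.accepts(sample))    # True while the sample is fresh enough
```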
Abstract:
As a positive response to requests coming from the legal world, which is often too distant from the scientific one, the aim is to develop a system that is technically solid and legally clear, directed at a better search for the truth. The objective is to create a versatile and easy-to-use tool to be made available to the judicial authority (A.G.) and possibly to the operating judicial police (P.G.), allowing investigative activity to continue very quickly and with a considerable containment of justice costs compared with an ordinary CTU (court-appointed technical expert report). The project will focus on forensic computer analyses of digital media involved in various types of proceedings for which a CTU or an expert report would normally be required. The scientific experimentation provides for a system of direct participation of the P.G. and A.G. in the computer analysis, making the content of the seized media available in the form of a virtual machine, so that it can be examined just like the original media. In this way the technical consultant (CT) becomes a mere guide for the P.G. and the A.G. within the forensic computer investigation, accompanying the judge and the parties toward a better understanding of the information requested. The key phases of the experimentation are:
• repeatability of the operations performed
• clear guidelines for the chain of custody from the moment the media are taken into charge
• data preservation and transmission methods able to guarantee the integrity and confidentiality of the data
• reduced times and costs compared with ordinary CTUs/expert reports
• direct viewing by the parties and the judge of the contents of the analyzed media, limited to the information useful for the purposes of justice
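As an illustration of the integrity requirement in the list above, a standard approach (assumed here, not prescribed by the abstract) is to hash the acquired media, so that the virtual machine copy examined by the parties can always be verified against the original acquisition:

```python
import hashlib

# A minimal sketch of the integrity step: hashing an acquired disk image
# so that every later examination of the virtual machine copy can be
# verified against the original acquisition. Path below is hypothetical.
def sha256_of_image(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as image:
        for chunk in iter(lambda: image.read(chunk_size), b""):
            digest.update(chunk)          # stream the image, 1 MiB at a time
    return digest.hexdigest()

# Recorded at acquisition time and re-checked before/after each analysis:
# print(sha256_of_image("evidence/disk001.dd"))
```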
Abstract:
The miniaturization race in the hardware industry, aiming at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes very critical. Two primary aspects introduce a sophisticated trade-off. On the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth. On the other hand, it should hide the heterogeneous hardware structure from the end user, in order to support feasible high-level programmability of the system. This thesis work explores heterogeneous reconfigurable hardware architectures and presents possible solutions to cope with the problem of memory organization and data structure. Using the example of the MORPHEUS heterogeneous platform, the discussion follows the complete design cycle, starting from decision making and justification, up to hardware realization. Particular emphasis is placed on the methods to support high system performance, meet application requirements, and provide a user-friendly programmer interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which copes with its task by separating computation from communication, providing the reconfigurable engines with computation and configuration data, and unifying heterogeneous computational devices through local storage buffers. It is distinguished from related solutions by its distributed data-flow organization, specifically engineered mechanisms to operate with data on local domains, a particular communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel advanced technique to accelerate memory access was developed and implemented.
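A toy sketch of the "separating computation from communication" principle via double-buffered local storage (purely illustrative; the MORPHEUS memory subsystem is far more elaborate): while the engine computes on one buffer, the communication layer refills the other, hiding transfer latency and preventing stalls:

```python
# Hypothetical double-buffering sketch: in hardware the "fetch" (e.g. a
# NoC DMA transfer) and the "compute" would run concurrently on the two
# ping-pong buffers; here they alternate sequentially for clarity.
def process(stream, compute, fetch, buf_size=4):
    buffers = [fetch(stream, buf_size), None]   # buffer 0 pre-filled
    current = 0
    results = []
    while buffers[current]:
        buffers[1 - current] = fetch(stream, buf_size)   # "communication"
        results.extend(compute(buffers[current]))        # "computation"
        current = 1 - current                            # swap buffers
    return results

data = iter(range(10))
fetch = lambda s, n: [x for _, x in zip(range(n), s)]    # next n items
compute = lambda buf: [x * x for x in buf]               # stand-in kernel
print(process(data, compute, fetch))   # [0, 1, 4, ..., 81]
```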
Abstract:
Besides the traditional paradigm of "centralized" power generation, a new concept of "distributed" generation is emerging, in which the user becomes a prosumer. During this transition, Energy Storage Systems (ESS) can provide multiple services and features which are necessary for a higher quality of the electrical system and for the optimization of non-programmable Renewable Energy Source (RES) power plants. An ESS prototype was designed, developed and integrated into a renewable energy production system in order to create a smart microgrid and consequently manage the energy flow in an efficient and intelligent way as a function of the power demand. The produced energy can be fed into the grid, supplied directly to the load, or stored in batteries. The microgrid comprises a 7 kW wind turbine (WT) and a 17 kW photovoltaic (PV) plant. The load is given by the electrical utilities of a cheese factory. The ESS consists of two subsystems: a Battery Energy Storage System (BESS) and a Power Control System (PCS). With the aim of sizing the ESS, a Remote Grid Analyzer (RGA) was designed, realized and connected to the wind turbine, the photovoltaic plant and the switchboard. Afterwards, different electrochemical storage technologies were studied and, taking into account the load requirements of the cheese factory, the most suitable solution was identified in the high-temperature molten salt Na-NiCl2 battery technology. The data acquisition from all electrical utilities provided a detailed load analysis, indicating an optimal storage size of a 30 kW battery system. Moreover, a container was designed and realized to house the BESS and PCS, meeting all the requirements and safety conditions. Furthermore, a smart control system was implemented in order to handle the different applications of the ESS, such as peak shaving or load levelling.
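As an illustration of the peak-shaving application mentioned above, here is a minimal dispatch sketch (all thresholds, time steps and names are hypothetical, apart from the 30 kW battery size taken from the load analysis):

```python
# Hypothetical peak-shaving logic: the battery discharges when the net
# load exceeds a threshold and recharges from renewable surplus.
BATTERY_KW = 30.0          # battery system size from the load analysis

def dispatch(load_kw, res_kw, soc, capacity_kwh=30.0, peak_kw=40.0, dt_h=0.25):
    """One control step. soc is stored energy [kWh]; returns grid power, soc."""
    net = load_kw - res_kw                       # residual load after WT + PV
    if net > peak_kw:                            # shave the peak
        p_batt = min(net - peak_kw, BATTERY_KW, soc / dt_h)           # discharge
    elif net < 0:                                # renewable surplus
        p_batt = -min(-net, BATTERY_KW, (capacity_kwh - soc) / dt_h)  # charge
    else:
        p_batt = 0.0
    soc -= p_batt * dt_h                         # update stored energy
    return net - p_batt, soc                     # grid exchange, new state

grid_kw, soc = dispatch(load_kw=55.0, res_kw=6.0, soc=20.0)
print(grid_kw, soc)   # peak clipped to 40 kW, battery partly drained
```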
Abstract:
Power-to-Gas storage systems have the potential to address grid-stability issues that arise when an increasing share of power is generated from sources with a highly variable output. Although proof-of-concept demonstrations of these systems have been promising, the behaviour of the processes in off-design conditions is not easily predictable. The primary aim of this PhD project was to evaluate the performance of an original Power-to-Gas system made up of innovative components. To achieve this, a numerical model was developed to simulate the characteristics and behaviour of the several components when the whole system is coupled with a renewable source. The developed model was applied to a large variety of scenarios, evaluating the performance of the considered process while exploiting a limited amount of experimental data. The model was then used to compare different Power-to-Gas concepts in a realistic operating scenario. Several goals have been achieved. In the concept phase, the possibility of thermally integrating the high-temperature components was demonstrated. Then, the parameters that affect the energy performance of a Power-to-Gas system coupled with a renewable source were identified, providing general recommendations on the design of hybrid systems; these parameters are: 1) the ratio between the storage system size and the renewable generator size; 2) the type of coupled renewable source; 3) the related production profile. Finally, the results of the comparative analysis highlight that configurations with a renewable source highly oversized with respect to the storage system show the maximum achievable profit.
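As a toy illustration of parameter 1), the sketch below sweeps the storage-to-generator size ratio over a hypothetical production profile and reports the fraction of renewable energy the Power-to-Gas system can absorb (all numbers illustrative, not the thesis' data):

```python
# Hypothetical sizing-ratio sweep: a larger Power-to-Gas system absorbs
# more of the variable renewable output, at a higher capital cost.
def absorbed_fraction(profile_kw, gen_kw, ratio):
    p2g_kw = ratio * gen_kw                    # storage system size
    absorbed = sum(min(p, p2g_kw) for p in profile_kw)
    return absorbed / sum(profile_kw)

profile = [0, 120, 480, 900, 1000, 700, 300, 50]   # toy PV-like profile (kW)
for ratio in (0.25, 0.5, 0.75, 1.0):
    print(ratio, round(absorbed_fraction(profile, gen_kw=1000, ratio=ratio), 2))
```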
Abstract:
Bioinformatics is a recent and emerging discipline which aims at studying biological problems through computational approaches. Most branches of bioinformatics, such as Genomics, Proteomics and Molecular Dynamics, are particularly computationally intensive, requiring huge amounts of computational resources for running algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three years of Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I hence created three major frameworks, Vnas, GridDBManager and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides an abstraction with reliability over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that simplifies the creation of arbitrarily complex multistage computational pipelines, and provides an abstracted virtual sandbox which bypasses Grid limitations. Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting the equality of virtual sandbox files based on content, across different submissions, even when performed by different users. BGBlast, an evolution of the earlier project GridBlast, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager sports very novel features, such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing between response times (performance) and storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage costs required to keep such older versions available in the Grid environment by two orders of magnitude. The SETest framework provides a way for the user to test and regression-test Python applications riddled with side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing invocations and results of filtered calls. The framework hence significantly accelerates the development of new applications and computational pipelines for the Grid environment, and reduces the effort required for maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D. work originated various publications in journals and conference proceedings, as reported in the Appendix. I also orally presented my work at numerous international conferences related to Grid computing and bioinformatics.
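As an illustration of the content-based sandbox deduplication mentioned above, here is a minimal sketch (a hypothetical interface, not Vnas' actual one) where files are keyed by the hash of their content, so identical files from different submissions are stored once:

```python
import hashlib

# Hypothetical content-addressed store: files are identified by the hash
# of their content, so identical inputs uploaded in different submissions
# (even by different users) occupy storage only once.
class SandboxStore:
    def __init__(self):
        self._blobs = {}                     # content hash -> bytes

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(key, data)    # reuse if already present
        return key

store = SandboxStore()
a = store.put(b"QUERY SEQUENCES v1")   # user 1 uploads an input file
b = store.put(b"QUERY SEQUENCES v1")   # user 2 uploads identical content
print(a == b, len(store._blobs))       # True 1 -- stored only once
```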
Abstract:
Machine learning comprises a series of techniques for the automatic extraction of meaningful information from large collections of noisy data. In many real-world applications, data is naturally represented in structured form. Since traditional methods in machine learning deal with vectorial information, they require an a priori form of preprocessing. Among all the learning techniques for dealing with structured data, kernel methods are recognized to have a strong theoretical background and to be effective approaches. They do not require an explicit vectorial representation of the data in terms of features, but rely on a measure of similarity between any pair of objects of a domain, the kernel function. Designing fast and good kernel functions is a challenging problem. In the case of tree-structured data, two issues become relevant: kernels for trees should not be sparse and should be fast to compute. The sparsity problem arises when, given a dataset and a kernel function, most structures of the dataset are completely dissimilar to one another. In those cases the classifier has too little information for making correct predictions on unseen data; in fact, it tends to produce a discriminating function behaving like the nearest neighbour rule. Sparsity is likely to arise for some standard tree kernel functions, such as the subtree and subset tree kernels, when they are applied to datasets with node labels belonging to a large domain. A second drawback of using tree kernels is the time complexity required in both the learning and classification phases. Such complexity can sometimes prevent the kernel's application in scenarios involving large amounts of data. This thesis proposes three contributions for resolving the above issues of kernels for trees. A first contribution aims at creating kernel functions which adapt to the statistical properties of the dataset, thus reducing sparsity with respect to traditional tree kernel functions. Specifically, we propose to encode the input trees by an algorithm able to project the data onto a lower-dimensional space with the property that similar structures are mapped similarly. By building kernel functions on the lower-dimensional representation, we are able to perform inexact matchings between different inputs in the original space. A second contribution is the proposal of a novel kernel function based on the convolution kernel framework. A convolution kernel measures the similarity of two objects in terms of the similarities of their subparts. Most convolution kernels are based on counting the number of shared substructures, partially discarding information about their position in the original structure. The kernel function we propose is, instead, especially focused on this aspect. A third contribution is devoted to reducing the computational burden related to the calculation of a kernel function between a tree and a forest of trees, which is a typical operation in the classification phase and, for some algorithms, also in the learning phase. We propose a general methodology applicable to convolution kernels. Moreover, we show an instantiation of our technique when kernels such as the subtree and subset tree kernels are employed. In those cases, Directed Acyclic Graphs can be used to compactly represent shared substructures in different trees, thus reducing the computational burden and storage requirements.
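For reference, a minimal convolution tree kernel in the spirit of the subtree kernel discussed above (an illustrative baseline, not the thesis' proposed kernels): each tree is mapped to the multiset of its full subtrees, and the kernel is the dot product of these feature vectors:

```python
from collections import Counter

# Minimal subtree kernel sketch: K(T1, T2) counts the full subtrees the
# two trees share. Trees are nested tuples: (label, child1, child2, ...).
def subtrees(tree):
    """Return canonical strings for the full subtree rooted at each node."""
    label, children = tree[0], tree[1:]
    encoded = [subtrees(c) for c in children]
    here = f"({label}{''.join(e[0] for e in encoded)})"   # root encoding first
    return [here] + [s for e in encoded for s in e]

def subtree_kernel(t1, t2):
    c1, c2 = Counter(subtrees(t1)), Counter(subtrees(t2))
    return sum(c1[s] * c2[s] for s in c1)      # dot product in feature space

t1 = ("S", ("NP", ("D",), ("N",)), ("VP", ("V",)))
t2 = ("S", ("NP", ("D",), ("N",)), ("VP", ("V",), ("NP", ("D",), ("N",))))
print(subtree_kernel(t1, t2))   # 7: shared (D), (N), (NP(D)(N)), (V) matches
```

With large label domains most such counts are zero, which is exactly the sparsity problem the first contribution targets.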
Abstract:
The assessment of the risks associated with the operation of storage systems, such as induced seismicity and subsidence, is a basic requirement for their correct management and design, and relies on defining the influence of pore-pressure variations in the subsurface on the stress state. The main goal of this project is to develop a methodology able to quantify reservoir deformations as a function of pore pressure, and to calibrate the models used against case studies with real monitoring data, allowing a comparison with model predictions. In this thesis, inhomogeneity theory has been used, through a semi-analytical approach, to define the variations of the elastic fields resulting from fluid withdrawal and injection operations in geological reservoirs. The extent, shape and magnitude of the induced stress variations were evaluated through the concept of critical stress change according to the Coulomb failure criterion, by means of a finite element numerical analysis. The developed methodology was applied to and calibrated on two depleted reservoirs converted to storage systems, which feature different datasets, geology, petrophysics and operating conditions. The variations of the elastic fields and the subsidence were computed, and the Coulomb critical stress change was mapped for both cases. The results show good agreement with the monitoring observations, suggesting the soundness of the methodology and indicating a low probability of induced seismicity. This project led to the creation of a fast and effective methodological platform for estimating the influence of gas storage systems on the stress state of the Earth's crust; during storage operations, it allows induced deformations and stresses to be monitored; during the design phase, it allows operational strategies to be evaluated in order to monitor and mitigate the geological risks associated with these systems.
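The abstract does not spell the criterion out; in the common formulation (assumed here), the Coulomb failure stress change induced by pressure variations is

```latex
\Delta \mathrm{CFS} \;=\; \Delta\tau \;+\; \mu \left( \Delta\sigma_n + \Delta p \right)
```

where Δτ is the change in shear stress on the fault plane, Δσn the change in normal stress (tension positive), Δp the pore-pressure change, and μ the friction coefficient; failure is promoted where ΔCFS > 0.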
Abstract:
Within the framework of micro-CHP (Combined Heat and Power) energy systems and the Distributed Generation (DG) concept, an Integrated Energy System (IES) was conceived and built, able to meet the energy and thermal requirements of specific users, using different types of fuel to feed several micro-CHP energy sources, with the integration of electric generators based on renewable energy sources (RES), electrical and thermal storage systems, and the control system. A 5 kWel Polymer Electrolyte Membrane Fuel Cell (PEMFC) has been studied. Using experimental data obtained from various measurement campaigns, the electrical and CHP performance of the PEMFC system has been determined. The effect of the water management of the anodic exhaust at variable FC loads has been analyzed, and the purge process programming logic was optimized, also leading to the determination of the optimal flooding times as the AC power delivered by the cell varies. Furthermore, the degradation mechanisms of the PEMFC system, in particular those due to flooding of the anodic side, have been assessed using an algorithm that treats the FC as a black box and is able to determine the amount of unreacted H2 and, therefore, the causes that produce it. Using experimental data covering a two-year time span, the ageing suffered by the FC system has been assessed and analyzed.
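The abstract does not give the black-box algorithm's equations; a standard mass balance of this kind (assumed here) compares the hydrogen fed to the stack with the amount consumed electrochemically according to Faraday's law:

```latex
\dot{n}_{\mathrm{H_2,\,reacted}} \;=\; \frac{N_{\mathrm{cell}}\, I}{2F},
\qquad
\dot{n}_{\mathrm{H_2,\,unreacted}} \;=\; \dot{n}_{\mathrm{H_2,\,in}} \;-\; \frac{N_{\mathrm{cell}}\, I}{2F}
```

where N_cell is the number of cells in the stack, I the stack current and F = 96485 C/mol is Faraday's constant; the factor 2 accounts for the two electrons exchanged per H2 molecule.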
Abstract:
According to recent studies, antioxidant supplementation of gamete processing and/or storage solutions improves gamete quality parameters after cooling or storage at sub-zero temperature. The aim of the present study was to investigate the effects of antioxidant supplementation on pig and horse gamete storage. The first study aimed to determine the effects of resveratrol (RESV) on the apoptotic status of porcine oocytes vitrified by the Cryotop method, evaluating phosphatidylserine (PS) exteriorization and caspase activation. RESV (2 µM) was added during: IVM (A); 2 h post-warming incubation (B); vitrification/warming and 2 h post-warming incubation (C); all previous phases (D). The obtained data demonstrate that RESV supplementation in the various steps of the IVM and vitrification/warming procedure can modulate the apoptotic process, improving the resistance of porcine oocytes to cryopreservation-induced damage. In the second work, different concentrations of RESV (10, 20, 40, and 80 µM) were added during liquid storage of stallion sperm for 24 hours at either 10°C or 4°C, under anaerobic conditions. Our findings demonstrate that RESV supplementation does not enhance the sperm quality of stallion semen after 24 hours of storage. Moreover, the highest RESV concentrations tested (40 and 80 µM) could damage sperm functional status, probably acting as a pro-oxidant. Finally, in the third work, two other antioxidants, ascorbic acid (AA) (100 µM) and glutathione (GSH) (5 mM), were added to boar freezing and/or thawing solutions. In our study, different sperm parameters were evaluated before freezing and at 30 and 240 minutes after thawing. Our results showed that GSH and AA significantly improved boar sperm cryotolerance, especially when supplemented together in both freezing and thawing media. This improvement was observed in sperm viability and acrosome integrity, sperm motility, and nucleoprotein structure. Although ROS levels were not much increased by the freeze-thawing procedures, the addition of GSH and AA to both freezing and thawing extenders significantly decreased intracellular peroxide levels.
Abstract:
In the digital age, e-health technologies play a pivotal role in the processing of medical information. As personal health data represents sensitive information concerning a data subject, enhancing data protection and the security of systems and practices has become a primary concern. In recent years, there has been increasing interest in the concept of Privacy by Design, which aims at developing a product or a service in a way that supports privacy principles and rules. In the EU, Article 25 of the General Data Protection Regulation provides a binding obligation to implement Data Protection by Design technical and organisational measures. This thesis explores how an e-health system could be developed, and how data processing activities could be carried out, to apply data protection principles and requirements from the design stage. The research attempts to bridge the gap between the legal and technical disciplines on DPbD by providing a set of guidelines for the implementation of the principle. The work is based on literature review, legal and comparative analysis, and an investigation of existing technical solutions and engineering methodologies. The work can be differentiated into theoretical and applied perspectives. First, it conducts a critical legal analysis of the principle of PbD and studies the DPbD legal obligation and the related provisions. Later, the research contextualises the rule in the health care field by investigating the applicable legal framework for personal health data processing. Moreover, the research focuses on the US legal system by conducting a comparative analysis. Adopting an applied perspective, the research investigates the existing technical methodologies and tools to design for data protection, and it proposes a set of comprehensive DPbD organisational and technical guidelines for a crucial case study, namely an Electronic Health Record system.
Abstract:
Today, the contribution of the transportation sector to greenhouse gas emissions is evident. The fast consumption of fossil fuels and its impact on the environment have given a strong impetus to the development of vehicles with better fuel economy. Hybrid electric vehicles fit into this context with different targets, starting from the reduction of emissions and fuel consumption, but also for performance and comfort enhancement. Vehicles exist with various missions; super sport cars usually aim to reach peak performance and to guarantee a great driving experience, but great attention must also be paid to fuel consumption. According to the vehicle mission, hybrid vehicles can differ in the powertrain configuration and the choice of the energy storage system. Lamborghini has recently invested in the development of hybrid super sport cars, for performance and comfort reasons, with the possibility of reducing fuel consumption. This research activity was conducted as a joint collaboration between the University of Bologna and the sports car manufacturer, to analyze the impact of innovative energy storage solutions on hybrid vehicle performance. Capacitors have been studied and modeled to analyze the pros and cons of such a solution with respect to batteries. To this aim, a full simulation environment has been developed and validated to provide a concept design tool capable of accurate results, able to predict longitudinal performance over regulated emission cycles and in real driving conditions, with a focus on fuel consumption. In addition, the target of the research activity is to deepen the study of hybrid electric super sports cars in the concept development phase, focusing on defining the control strategies and the energy storage system technology that best suit the needs of the vehicles. This dissertation covers the key steps that have been carried out in the research project.
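As an illustration of the kind of component model such a simulation environment typically contains, here is a first-order sketch of a capacitor-based storage branch stepped over a requested power profile (all parameter values hypothetical, not the thesis' data):

```python
# Hypothetical first-order capacitor model: ideal capacitance C behind an
# equivalent series resistance R, integrated over a power request.
def step(v, p_req, c=63.0, r=0.012, dt=0.01):
    """One integration step: v is capacitor voltage [V], p_req power [W]
    (positive = discharge). Returns new voltage and internal power draw."""
    i = p_req / v                      # simple current estimate
    v_next = v - (i * dt) / c          # capacitor voltage dynamics
    p_loss = r * i * i                 # ohmic loss in the ESR
    return v_next, p_req + p_loss      # power actually drawn from the cell

v = 125.0                              # initial pack voltage
for _ in range(100):                   # 1 s boost at 80 kW
    v, p_int = step(v, 80e3)
print(round(v, 2), round(p_int))       # voltage sag and internal power
```

The low series resistance and the direct voltage-to-state relation are what make capacitors attractive for short power peaks, while their modest energy content compared with batteries is visible in how quickly the voltage sags.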