16 results for Models performance
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
My project explores and compares different forms of gender performance in contemporary art and visual culture from a perspective centered on photography. Thanks to its attesting power, this medium can work as a ready-made. Indeed, during the 20th century it played a key role in the cultural emancipation of the body, which (to use Michel Foucault's expression) has now become «the zero point of the world». Through performance the body proves to be a living material of expression and communication, while photography ensures the recording of any ephemeral event that happens in time and space. My questioning approach considers the constructed imagery of gender from the 1990s to the present in order to investigate how photography's strong aura of realism promotes and allows fantasies of transformation. The contemporary fascination with gender (especially in art and fashion) represents a crucial issue in the global context of postmodernity and is manifested in a variety of visual media, from photography to video and film. Moreover, the internet, along with its digital transmission of images, has deeply affected our world (from culture to everyday life), leading to a postmodern preference for performativity over the more traditional and linear forms of narrativity. As a consequence, individual borders are redefined by the skin itself, which (dissected through instant vision) turns into a ductile material of mutation and hybridization in the service of identity. My critical assumptions are drawn from the most relevant changes that occurred in philosophy during the last two decades as a result of the contributions of Jacques Lacan, Michel Foucault, Jacques Derrida and Gilles Deleuze, who developed a cross-disciplinary and comparative approach to interpret the crisis of modernity. They have profoundly influenced feminist studies, so that the category of gender has been reassessed in contrast with sex (as a biological connotation) and in relation to history, culture and society.
The ideal starting point of my research is the year 1990. I chose it as the approximate historical moment when the intersections of race, class and gender were placed at the forefront of international artistic production concerned with identity, diversity and globalization. Such issues had been explored throughout the 1970s, but it was only from the mid-1980s onward that they began to be articulated more consistently. Published in 1990, Judith Butler's book "Gender Trouble: Feminism and the Subversion of Identity" marked an important breakthrough by linking gender to performance and by investigating the intricate connections between theory and practice, embodiment and representation. It inspired subsequent research in a variety of disciplines, art history included. In the same year Teresa de Lauretis introduced the term "queer theory" to challenge the academic perspective in gay and lesbian studies. In the meantime, the rise of Third Wave Feminism in the US introduced a racially and sexually inclusive vision of the global situation in order to reflect on subjectivity, new technologies and popular culture in connection with gender representation. These conceptual tools have enabled prolific readings of contemporary cultural production, whether fine arts or mass media. After discussing the appropriate framework of my project and taking into account the postmodern globalization of the visual, I turned to photography to map gender representation both in art and in fashion, and have therefore been creating an archive of images around specific topics. I decided to include fashion photography because in the 1990s this genre moved away from the paradigm of an idealized, classical beauty toward a new vernacular allied with lifestyles, art practices, pop and youth culture; as one might expect, the dominant narrative modes in fashion photography are now mainly influenced by cinema and the snapshot.
These strategies generate story lines and interrupted narratives that use models' performance to convey a particular imagery in which identity issues emerge as an essential part of the fashion spectacle. Focusing on the intersections of gender identities with socially and culturally produced identities, my approach underlines how the fashion world has turned to current trends in art photography and, in some cases, to the artists themselves. The growing fluidity of the categories that distinguish art from fashion photography represents a particularly fruitful moment of visual exchange. Varying over time, the dialogue between these two fields has always been vital; nowadays it can be studied as a result of the close relationship between the contemporary art world and consumer culture. Due to the saturation of postmodern imagery, the feedback between art and fashion has become much more immediate and thus increasingly significant for anyone who wants to investigate the construction of gender identity through performance. In addition, many magazines founded in the 1990s bridged the worlds of art and fashion because some of their designers and even editors were art-school graduates who encouraged innovation. The inclusion of art within such magazines aimed at validating them as a form of art in themselves, supporting a dynamic intersection of music, fashion, design and youth culture: an intersection that also contributed to creating and spreading different gender stereotypes. This general interest in fashion produced many exhibitions of and about fashion itself at major international venues such as the Victoria and Albert Museum in London, the Metropolitan Museum of Art and the Solomon R. Guggenheim Museum in New York. Since then, this celebrated success of fashion has been regarded as a typical element of postmodern culture.
Accordingly, I have also based my analysis on some important exhibitions dealing with gender performance, such as "Féminin-Masculin" at the Centre Pompidou in Paris (1995), "Rrose is a Rrose is a Rrose. Gender Performance in Photography" at the Solomon R. Guggenheim Museum in New York (1997), "Global Feminisms" at the Brooklyn Museum (2007) and "Female Trouble" at the Pinakothek der Moderne in München, together with the workshops dedicated to "Performance: Gender and Identity" in June 2005 at Tate Modern in London. Since 2003, Italy has had Gender Bender, an international festival held annually in Bologna, which explores the gender imagery stemming from contemporary culture. Over a few days this festival offers a series of events ranging from visual arts, performance, cinema and literature to conferences and music. Aware that no method of research is race- or gender-neutral, I have traced these critical paths to question gender identity in a multicultural perspective, taking account of the political implications as well. In fact, if visibility may be equated with exposure, we can also read these images as points of intersection between visibility and social power. Since gender assignations rely so heavily on the visual, the postmodern dismantling of gender certainty through performance has wide-ranging effects that need to be analyzed. In some sense this practice can even contest the dominance of the visual within postmodernism. My visual map of contemporary art and fashion photography includes artists such as Nan Goldin, Cindy Sherman, Hellen van Meene, Rineke Dijkstra, Ed Templeton, Ryan McGinley, Anne Daems, Miwa Yanagi, Tracey Moffatt, Catherine Opie, Tomoko Sawada, Vanessa Beecroft, Yasumasa Morimura and Collier Schorr, among others.
Abstract:
Deep learning methods are extremely promising machine learning tools for analyzing neuroimaging data. However, their potential use in clinical settings is limited by the challenges of applying these methods to neuroimaging data. In this study, a type of data leakage caused by a slice-level data split, introduced during the training and validation of a 2D CNN, is first surveyed, and a quantitative assessment of the resulting overestimation of the model's performance is presented. Second, an interpretable, leakage-free deep learning software package, written in Python with a wide range of options, has been developed to conduct both classification and regression analyses. The software was applied to the study of mild cognitive impairment (MCI) in patients with small vessel disease (SVD) using multi-parametric MRI data, where the cognitive performance of 58 patients, measured by five neuropsychological tests, is predicted using a multi-input CNN model taking brain images and demographic data. Each of the cognitive test scores was predicted using different MRI-derived features. As MCI due to SVD has been hypothesized to be the effect of white matter damage, the DTI-derived features MD (mean diffusivity) and FA (fractional anisotropy) produced the best prediction of the TMT-A score, which is consistent with the existing literature. In a second study, an interpretable deep learning system is developed, aimed at 1) classifying Alzheimer's disease patients and healthy subjects, 2) examining the neural correlates of the disease that cause cognitive decline in AD patients using CNN visualization tools, and 3) highlighting the potential of interpretability techniques to detect a biased deep learning model. Structural magnetic resonance imaging (MRI) data of 200 subjects were used by the proposed CNN model, which was trained using a transfer learning-based approach, producing a balanced accuracy of 71.6%. Brain regions in the frontal and parietal lobes showing cerebral cortex atrophy were highlighted by the visualization tools.
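The slice-level leakage described above is easy to reproduce: if 2D slices from the same subject land in both training and validation sets, validation scores are inflated. A minimal sketch (subject IDs and the split helpers are illustrative, not from the thesis software):

```python
import random

def slice_level_split(slices, val_fraction=0.2, seed=0):
    """Naive split over individual slices -- subjects leak across sets."""
    rng = random.Random(seed)
    shuffled = slices[:]
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * val_fraction)
    return shuffled[n_val:], shuffled[:n_val]

def subject_level_split(slices, val_fraction=0.2, seed=0):
    """Leakage-free split: whole subjects go to either train or validation."""
    rng = random.Random(seed)
    subjects = sorted({subj for subj, _ in slices})
    rng.shuffle(subjects)
    n_val = max(1, int(len(subjects) * val_fraction))
    val_subjects = set(subjects[:n_val])
    train = [s for s in slices if s[0] not in val_subjects]
    val = [s for s in slices if s[0] in val_subjects]
    return train, val

def subjects_overlap(train, val):
    return bool({s for s, _ in train} & {s for s, _ in val})

# 10 hypothetical subjects x 20 axial slices each
data = [("sub-%02d" % i, k) for i in range(10) for k in range(20)]
leaky_train, leaky_val = slice_level_split(data)
clean_train, clean_val = subject_level_split(data)
print(subjects_overlap(leaky_train, leaky_val))   # True: validation is contaminated
print(subjects_overlap(clean_train, clean_val))   # False by construction
```

With the slice-level split, nearly every subject contributes slices to both sets, so the network can memorize subject-specific anatomy rather than disease markers; the subject-level split removes that shortcut.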
Abstract:
The concept of "sustainability" relates to prolonging human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship, and practices that conserve resources in a manner that allows growth and development to be sustained in the long term without degrading the environment, are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe, as well as in the U.S., are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures than hot mix asphalt (HMA), while aiming to maintain the desired post-construction properties of traditional HMA. Lowering the production temperature reduces fuel usage and emissions, thereby improving conditions for workers and supporting sustainable development. The crumb-rubber modifier (CRM), obtained from shredded automobile tires and used in the United States since the mid-1980s, has also proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is relevant not only from an environmental standpoint but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project aims to demonstrate the dual value of these asphalt mixes with regard to environmental and mechanical performance, and to suggest a low-environmental-impact design procedure. In fact, the use of eco-friendly materials is the first phase of an eco-compatible design, but it cannot be the only step.
The eco-compatible approach should also be extended to the design method and to material characterization, because only through these phases is it possible to exploit the maximum potential of the materials used. Appropriate asphalt concrete characterization is essential for realistic performance prediction of asphalt concrete pavements. Volumetric (mix design) and mechanical (permanent deformation and fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to use the material correctly. A design method such as a Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under different traffic and environmental conditions, was the application of choice. In particular, this study focuses on CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to surface cracking and rutting, respectively. It works in increments of time and, using the output from one increment recursively as input to the next, predicts the pavement condition in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation under defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as a surface layer of 60 mm thickness. The performance of the pavement was compared to that of the same pavement structure with different kinds of asphalt concrete as the surface layer. Three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed in comparison to a conventional asphalt concrete.
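The incremental-recursive loop can be sketched in a few lines: the damaged state computed in one time increment is fed back as input to the next. The damage law and every coefficient below are made-up placeholders, not the actual CalME fatigue or shear models:

```python
def run_incremental_recursive(e0, reps_per_step, n_steps, alpha=1e-6, beta=0.5):
    """Evolve a surface-layer modulus under accumulating fatigue damage.

    e0            -- intact layer modulus [MPa]
    reps_per_step -- traffic load repetitions per time increment
    alpha, beta   -- placeholder damage-law coefficients (not CalME values)
    """
    damage, modulus = 0.0, e0
    history = []
    for step in range(n_steps):
        # the damage increment grows as the layer softens: the output of one
        # increment (the damaged modulus) recursively feeds the next one
        d_damage = alpha * reps_per_step * (e0 / modulus) ** beta
        damage = min(1.0, damage + d_damage)
        modulus = e0 * max(1.0 - damage, 0.01)   # floor avoids a zero modulus
        history.append((step, damage, modulus))
    return history

# 120 hypothetical increments of 10,000 repetitions on a 3000 MPa layer
history = run_incremental_recursive(e0=3000.0, reps_per_step=10_000, n_steps=120)
```

The recursive coupling is the point: because each increment's damage rate depends on the modulus left over from the previous increment, damage accelerates as the layer softens, which a one-shot calculation would miss.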
The first two chapters summarize the steps needed to satisfy the sustainable pavement design procedure. In Chapter I the problem of eco-compatible asphalt pavement design is introduced. The low-environmental-impact materials, Warm Mix Asphalt and Rubberized Asphalt Concrete, are described in detail, and the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate material selection and performance evaluation. In Chapter III, CalME is introduced through an explanation of the different design approaches it provides, with a specific focus on the I-R procedure. In Chapter IV, the experimental program is presented together with an explanation of the laboratory test devices adopted. The fatigue and rutting performances of the study mixes are shown in Chapters V and VI, respectively. From these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the simulations of asphalt pavement structures with different surface layers are reported. For each pavement structure, the total surface cracking, total rutting, fatigue damage and rut depth in each bound layer were analyzed.
Abstract:
The Ph.D. project focuses on the modelling of the soil-water dynamics inside an instrumented embankment section along the Secchia River (Cavezzo, MO) in the period from 2017 to 2018, and on the quantification of the performance of the direct and inverse simulations. The commercial code Hydrus2D by PC-Progress was chosen to run the direct simulations. Different soil-hydraulic models have been adopted and compared. The parameters of the different hydraulic models are calibrated using a local optimization method based on the Levenberg-Marquardt algorithm implemented in the Hydrus package. The calibration program is carried out using different datasets of observation points, different weighting distributions, different combinations of optimized parameters and different initial sets of parameters. The final goal is an in-depth study of the potential and limits of inverse analysis when applied to a complex geotechnical problem such as this case study. The second part of the research focuses on the effects of plant roots and soil-vegetation-atmosphere interaction on the spatial and temporal distribution of pore water pressure in soil. The investigated soil belongs to the West Charlestown Bypass embankment, Newcastle, Australia, which showed shallow instabilities in past years; the use of long stem planting is intended to stabilize the slope. The chosen plant species is Melaleuca styphelioides, native to eastern Australia. The research activity included the design and realization of a specific large-scale apparatus for laboratory experiments. Local suction measurements at given depth intervals and radial distances from the root bulb are recorded within the vegetated soil mass under controlled boundary conditions. The experiments are then reproduced numerically using the commercial code Hydrus2D. Laboratory data are used to calibrate the root water uptake (RWU) parameters and the parameters of the hydraulic model.
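The inverse-analysis mechanics can be illustrated on a toy problem: a hand-rolled Levenberg-Marquardt loop (numerical Jacobian, adaptive damping, 2x2 normal equations solved in closed form) calibrating two parameters of a van Genuchten-type retention curve against synthetic observations. Hydrus fits far richer models with many more options; every value below is invented:

```python
def theta(h, alpha, n, theta_r=0.05, theta_s=0.40):
    """van Genuchten water content for suction head h > 0."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

def residuals(params, data):
    a, n = params
    return [theta(h, a, n) - obs for h, obs in data]

def sse(params, data):
    return sum(r * r for r in residuals(params, data))

def lm_fit(data, a=0.5, n=1.5, lam=1e-2, iters=80, eps=1e-6):
    """Levenberg-Marquardt with a central-difference Jacobian."""
    cur = sse((a, n), data)
    for _ in range(iters):
        r = residuals((a, n), data)
        # Jacobian columns d r / d alpha and d r / d n
        ja = [(p - q) / (2 * eps) for p, q in
              zip(residuals((a + eps, n), data), residuals((a - eps, n), data))]
        jn = [(p - q) / (2 * eps) for p, q in
              zip(residuals((a, n + eps), data), residuals((a, n - eps), data))]
        # damped 2x2 normal equations (J^T J + lam I) d = -J^T r
        A11 = sum(x * x for x in ja) + lam
        A22 = sum(x * x for x in jn) + lam
        A12 = sum(x * y for x, y in zip(ja, jn))
        b1 = -sum(x * y for x, y in zip(ja, r))
        b2 = -sum(x * y for x, y in zip(jn, r))
        det = A11 * A22 - A12 * A12
        cand = (max(1e-3, a + (b1 * A22 - b2 * A12) / det),
                max(1.01, n + (A11 * b2 - A12 * b1) / det))
        new = sse(cand, data)
        if new < cur:                      # accept step, relax damping
            (a, n), cur, lam = cand, new, lam / 3.0
        else:                              # reject step, increase damping
            lam *= 3.0
    return a, n, cur

# synthetic observations generated with alpha = 0.8, n = 2.0
obs = [(h, theta(h, 0.8, 2.0)) for h in (0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0)]
a_fit, n_fit, final_sse = lm_fit(obs)
```

With exact synthetic data the fit recovers the generating parameters; with noisy field data and more free parameters, the same loop can stall in local minima, which is exactly the sensitivity to initial parameter sets that the thesis investigates.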
Abstract:
The accurate representation of the Earth Radiation Budget by General Circulation Models (GCMs) is a fundamental requirement for providing reliable historical and future climate simulations. In this study, we found reasonable agreement on a global scale between the integrated energy fluxes at the top of the atmosphere simulated by 34 state-of-the-art climate models and the observations provided by the Cloud and Earth Radiant Energy System (CERES) mission, but large regional biases were detected throughout the globe. Furthermore, we highlighted that a good agreement between simulated and observed integrated Outgoing Longwave Radiation (OLR) fluxes may result from the cancellation of opposite-in-sign systematic errors localized in different spectral ranges. To avoid this, and to understand the causes of these biases, we compared the observed Earth emission spectra, measured by the Infrared Atmospheric Sounding Interferometer (IASI) in the period 2008-2016, with synthetic radiances computed on the basis of the atmospheric fields provided by the EC-Earth GCM. For this purpose, the fast σ-IASI radiative transfer model was used, after its validation and implementation in EC-Earth. From the comparison between observed and simulated spectral radiances, a positive temperature bias in the stratosphere and a negative temperature bias in the middle troposphere, as well as a dry bias of the water vapor concentration in the upper troposphere, have been identified in the EC-Earth climate model. The analysis has been performed in clear-sky conditions, but the feasibility of its extension to the presence of clouds, whose impact on radiation represents the greatest source of uncertainty in climate models, has also been demonstrated. Finally, the analysis of simulated and observed OLR trends indicated good agreement and provided detailed information on the spectral fingerprints of the evolution of the main climate variables.
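The compensation effect mentioned above, good agreement in the integrated OLR despite opposite-in-sign spectral errors, can be shown with a toy numeric example; the band limits and flux values are invented for illustration only:

```python
bands = {               # band -> (simulated, observed) OLR flux [W m^-2]
    "far-IR (100-667 cm^-1)":   (121.0, 118.0),   # +3.0 W m^-2 bias
    "CO2 band (667-800 cm^-1)": ( 28.0,  28.5),   # -0.5
    "window (800-1250 cm^-1)":  ( 92.0,  94.5),   # -2.5
    "rest (>1250 cm^-1)":       ( 19.0,  19.0),   #  0.0
}

band_bias = {name: sim - obs for name, (sim, obs) in bands.items()}
integrated_bias = sum(band_bias.values())
print(integrated_bias)   # 0.0: band biases up to 3 W m^-2 cancel exactly
```

A broadband comparison alone would rate this hypothetical model as essentially perfect, which is why the study works with spectrally resolved radiances instead of integrated fluxes.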
Abstract:
The Assimilation in the Unstable Subspace (AUS) was introduced by Trevisan and Uboldi in 2004, and further developed by Trevisan, Uboldi and Carrassi, to minimize the analysis and forecast errors by exploiting the flow-dependent instabilities of the forecast-analysis cycle system, which may be thought of as a system forced by observations. In the AUS scheme the assimilation is obtained by confining the analysis increment to the unstable subspace of the forecast-analysis cycle system, so that it has the same structure as the dominant instabilities of the system. The unstable subspace is estimated by Breeding on the Data Assimilation System (BDAS). AUS-BDAS has already been tested in realistic models and observational configurations, including a Quasi-Geostrophic model and a high-dimensional, primitive equation ocean model; the experiments include both fixed and "adaptive" observations. In these contexts, the AUS-BDAS approach greatly reduces the analysis error, with reasonable computational costs compared, for example, to a prohibitive full Extended Kalman Filter. This is a follow-up study in which we revisit the AUS-BDAS approach in the more basic, highly nonlinear Lorenz 1963 convective model. We run observing system simulation experiments in a perfect model setting, and also with two types of model error: random and systematic. In the different configurations examined, and in a perfect model setting, AUS once again shows better efficiency than other advanced data assimilation schemes. In the present study, we develop an iterative scheme that leads to a significant improvement of the overall assimilation performance with respect to standard AUS as well. In particular, it boosts the efficiency of tracking regime changes, at a low computational cost. Other data assimilation schemes need estimates of ad hoc parameters, which have to be tuned for the specific model at hand.
In Numerical Weather Prediction models, tuning of parameters — and in particular an estimate of the model error covariance matrix — may turn out to be quite difficult. Our proposed approach, instead, may be easier to implement in operational models.
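The AUS idea can be sketched on the Lorenz 1963 model itself: a single bred vector tracks the growing error direction, and each analysis increment is confined to that direction. Gains, intervals, noise levels and initial states below are illustrative choices, not the published AUS-BDAS configuration:

```python
import math, random

def rk4(s, dt=0.01):
    """One RK4 step of the Lorenz 1963 system (sigma=10, rho=28, beta=8/3)."""
    def f(s):
        x, y, z = s
        return (10.0 * (y - x), x * (28.0 - z) - y, x * y - (8.0 / 3.0) * z)
    k1 = f(s)
    k2 = f(tuple(si + 0.5 * dt * k for si, k in zip(s, k1)))
    k3 = f(tuple(si + 0.5 * dt * k for si, k in zip(s, k2)))
    k4 = f(tuple(si + dt * k for si, k in zip(s, k3)))
    return tuple(si + dt / 6.0 * (p + 2 * q + 2 * u + w)
                 for si, p, q, u, w in zip(s, k1, k2, k3, k4))

def norm(v):
    return math.sqrt(sum(c * c for c in v))

def run(assimilate, n_steps=4000, obs_every=25, obs_noise=0.5, seed=7):
    rng = random.Random(seed)
    truth = (1.0, 1.0, 20.0)
    model = tuple(c + rng.gauss(0.0, 4.0) for c in truth)  # wrong initial state
    bred = (1e-2, 0.0, 0.0)                                # bred perturbation
    errs = []
    for t in range(1, n_steps + 1):
        pert = tuple(m + b for m, b in zip(model, bred))
        truth, model, pert = rk4(truth), rk4(model), rk4(pert)
        bred = tuple(p - m for p, m in zip(pert, model))   # breeding (BDAS)
        if assimilate and t % obs_every == 0:
            obs = truth[0] + rng.gauss(0.0, obs_noise)     # observe x only
            innov = obs - model[0]
            e = tuple(b / (norm(bred) + 1e-12) for b in bred)
            # analysis increment confined to the unstable direction e
            gain = innov * e[0] / (e[0] * e[0] + obs_noise ** 2)
            model = tuple(m + gain * c for m, c in zip(model, e))
            bred = tuple(1e-2 * c for c in e)              # rescale bred vector
        errs.append(norm(tuple(m - s for m, s in zip(model, truth))))
    tail = errs[len(errs) // 2:]
    return sum(tail) / len(tail)

err_free = sum(run(False, seed=s) for s in (1, 2, 3)) / 3
err_aus = sum(run(True, seed=s) for s in (1, 2, 3)) / 3
```

Even with a single observed variable and a one-dimensional increment, the assimilated run stays far closer to the truth than the free-running forecast, whose error saturates at the attractor scale; this is the low-cost error control that motivates AUS over a full covariance-based filter.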
Abstract:
The hospital radiological workflow is currently completing a transition from analog to digital technology. Since digital X-ray detection technologies have matured, hospitals are taking advantage of natural device turnover to replace conventional screen-film devices with digital ones. The transition process is complex and involves not just equipment replacement but also new arrangements for image transmission, display (and reporting) and storage. This work focuses on the characterization of 2D digital detectors with attention to specific clinical applications; the system features linked to image quality are analyzed to assess the clinical performance, the conversion efficiency and the minimum dose necessary to obtain an acceptable image. The first section overviews digital detector technologies, focusing on recent and promising technological developments. The second section describes the characterization methods considered in this thesis, categorized as physical, psychophysical and clinical; theory, models and procedures are described as well. The third section contains a set of characterizations performed on new equipment that appears to be among the most advanced technologies available to date. The fourth section deals with procedures and schemes employed in quality assurance programs.
Abstract:
To continuously improve the performance of metal-oxide-semiconductor field-effect-transistors (MOSFETs), innovative device architectures, gate stack engineering and mobility enhancement techniques are under investigation. In this framework, new physics-based models for Technology Computer-Aided-Design (TCAD) simulation tools are needed to accurately predict the performance of upcoming nanoscale devices and to provide guidelines for their optimization. In this thesis, advanced physically-based mobility models for ultrathin body (UTB) devices with either planar or vertical architectures such as single-gate silicon-on-insulator (SOI) field-effect transistors (FETs), double-gate FETs, FinFETs and silicon nanowire FETs, integrating strain technology and high-κ gate stacks, are presented. The effective mobility of the two-dimensional electron/hole gas in a UTB FET channel is calculated taking into account its tensorial nature and the quantization effects. All the scattering events relevant for thin silicon films and for high-κ dielectrics and metal gates have been addressed and modeled for UTB FETs on differently oriented substrates. The effects of mechanical stress on (100) and (110) silicon band structures have been modeled for a generic stress configuration. Performance will also derive from heterogeneity, arising from the increasing diversity of functions integrated on complementary metal-oxide-semiconductor (CMOS) platforms. For example, new architectural concepts are of interest not only to extend the FET scaling process, but also to develop innovative sensor applications. Benefiting from properties like a large surface-to-volume ratio and extreme sensitivity to surface modifications, silicon-nanowire-based sensors are gaining special attention in research. In this thesis, a comprehensive analysis of the physical effects playing a role in the detection of gas molecules is carried out by TCAD simulations combined with interface characterization techniques.
The complex interaction of charge transport in silicon nanowires of different dimensions with interface trap states and remote charges is addressed to correctly reproduce experimental results of recently fabricated gas nanosensors.
Abstract:
The main objective of the dissertation is to illustrate how the social and educational aspects developed on different multifunctional organic farms in Italy and the Netherlands (in close interaction with other multifunctional aspects of organic agriculture), as well as the established agricultural policy frameworks in these countries, can be compared with the situation of Croatian organic farming and can contribute to the further development of organic agriculture in the Republic of Croatia. Through its chapters, the dissertation describes the performance of the organic agriculture sectors in Italy, the Netherlands and Croatia within their national agricultural policy frameworks; it analyzes the role of national institutions and policy in Croatia in connection with Croatia's status as a candidate country for entry into the EU and the harmonization of legislation with the CAP, as well as the role of national authorities, universities, research centres, private initiatives, NGOs and cooperatives in organic agriculture in the Netherlands, Italy and Croatia. Its main part describes how social and educational aspects interact with other multifunctional aspects of organic agriculture, and analyzes the benefits and contribution of multifunctional activities performed on organic farms to education, healthy nourishment, environmental protection and health care. It also assesses the strengths and weaknesses of organic agriculture in all the countries studied. The dissertation concludes with development opportunities for multifunctional organic agriculture in Croatia, as well as perspectives and recommendations for different approaches based on the experience of successful EU models, accompanied by some personal ideas and proposals.
Abstract:
Pancreatic islet transplantation represents a fascinating procedure that, at the moment, can be considered an alternative to standard insulin treatment or pancreas transplantation only for selected categories of patients with type 1 diabetes mellitus. Among the factors responsible for poor islet engraftment, hypoxia plays an important role. Mesenchymal stem cells (MSCs) were recently used in animal models of islet transplantation not only to reduce allograft rejection, but also to promote revascularization. Currently, adipose tissue represents a novel and good source of MSCs. Moreover, the capability of adipose-derived stem cells (ASCs) to improve islet graft revascularization was recently reported after hybrid transplantation in mice. Within this context, we have previously shown that hyaluronan esters of butyric and retinoic acids can significantly enhance the rescuing potential of human MSCs. Here we evaluated whether ex vivo preconditioning of human ASCs (hASCs) with a mixture of hyaluronic (HA), butyric (BU), and retinoic (RA) acids may result in optimization of graft revascularization after islet/stem cell intrahepatic cotransplantation in syngeneic diabetic rats. We demonstrated that hASCs exposed to the mixture of molecules are able to increase the secretion of vascular endothelial growth factor (VEGF), as well as the transcription of angiogenic genes, including VEGF, KDR (kinase insert domain receptor), and hepatocyte growth factor (HGF). Rats transplanted with islets cocultured with preconditioned hASCs exhibited better glycemic control than rats transplanted with an equal volume of islets and control hASCs. Cotransplantation with preconditioned hASCs was also associated with enhanced islet revascularization in vivo, as highlighted by graft morphological analysis.
The observed increase in islet graft revascularization and function suggests that our method of stem cell preconditioning may represent a novel strategy to remarkably improve the efficacy of islet-hASC cotransplantation.
Abstract:
In this Thesis a series of numerical models for the evaluation of the seasonal performance of reversible air-to-water heat pump systems coupled to residential and non-residential buildings is presented. Exploiting the energy-saving potential of heat pumps is a hard task for designers, because their energy performance is influenced by several factors, such as external climate variability, the heat pump modulation capacity, the system control strategy and the hydronic loop configuration. The aim of this work is to study all these aspects in detail. In the first part of this Thesis, a series of models using a temperature-class approach for the prediction of the seasonal performance of reversible air-source heat pumps is shown. An innovative methodology for the calculation of the seasonal performance of an air-to-water heat pump is proposed as an extension of the procedure reported in the European standard EN 14825. This methodology can be applied not only to air-to-water single-stage heat pumps (on-off HPs) but also to multi-stage (MSHPs) and inverter-driven units (IDHPs). In the second part, dynamic simulation is used to optimize the control systems of the heat pump and of the HVAC plant. A series of dynamic models, developed by means of TRNSYS, is presented to study the behavior of on-off HPs, MSHPs and IDHPs. The main goal of these dynamic simulations is to show the influence of the heat pump control strategies, and of the layout of the hydronic loop used to couple the heat pump to the emitters, on the seasonal performance of the system. A particular focus is given to the modeling of the energy losses linked to on-off cycling.
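The temperature-class (bin) idea can be reduced to a back-of-the-envelope calculation: the heating season is split into outdoor-temperature bins, each with a number of hours, a building load and a machine COP, and the seasonal COP is the ratio of heat delivered to electricity consumed. All bin data below are invented; the actual EN 14825 procedure prescribes reference conditions, part-load ratios and degradation coefficients:

```python
# (T_out [degC], hours in season, building load [kW], COP at that condition)
bins = [
    (-7, 100, 8.0, 2.1),
    (-2, 300, 6.5, 2.5),
    ( 2, 700, 5.2, 2.9),
    ( 7, 900, 3.6, 3.4),
    (12, 600, 1.9, 4.0),
]

def scop(bins):
    """Seasonal COP = total heat delivered / total electricity consumed."""
    heat = sum(hours * load for _, hours, load, _ in bins)           # kWh heat
    elec = sum(hours * load / cop for _, hours, load, cop in bins)   # kWh input
    return heat / elec

print(round(scop(bins), 2))   # -> 2.95
```

The seasonal figure is a load-weighted harmonic mean of the bin COPs, which is why mild bins with many hours dominate the result even though the rating point at -7 degC looks much worse.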
Abstract:
Fibre Reinforced Concretes are innovative composite materials whose applications are growing considerably nowadays. Being composite materials, their performance depends on the mechanical properties of both components, fibre and matrix, and, above all, on the interface between them. The variables to account for in the mechanical characterization of the material may be intrinsic to the material itself, i.e. fibre and concrete type, or external factors, i.e. environmental conditions. The first part of the research presented here focuses on the experimental and numerical characterization of the interface properties and short-term response of fibre reinforced concretes with macro-synthetic fibres (MSFRCs). The experimental database produced represents the starting point for the calibration and validation of numerical models with two principal purposes: the calibration of a local constitutive law, and the calibration and validation of a model predictive of the whole material response. With a view to designing sustainable admixtures, the matrix of cement-based fibre reinforced composites is optimized through partial substitution of the cement content. In the second part of the research, the effect of time-dependent phenomena on the MSFRC response is studied. An extended experimental campaign of creep tests is performed, analysing the effect of time and temperature variations under different loading conditions. Based on the results achieved, a numerical model able to account for the viscoelastic nature of both concrete and reinforcement, together with the environmental conditions, is calibrated within the LDPM framework. Different types of regression models are also developed, correlating the mechanical properties investigated (bond strength and residual flexural behaviour for the short-term analysis, and the creep coefficient over time for the time-dependent behaviour) with the variables investigated.
The experimental studies carried out highlight the several aspects influencing the mechanical performance of the material, and allow the identification of those properties that the numerical approach should consider in order to be reliable.
Abstract:
Modern scientific discoveries are driven by an insatiable demand for computational resources. High-Performance Computing (HPC) systems are an aggregation of computing power that delivers considerably higher performance than a typical desktop computer can provide, in order to solve large problems in science, engineering, or business. An HPC room in the datacenter is a complex controlled environment that hosts thousands of computing nodes; these consume electrical power in the range of megawatts, which is ultimately converted entirely into heat. Although a datacenter contains sophisticated cooling systems, our studies provide quantitative evidence of thermal bottlenecks in real-life production workloads, showing the presence of significant spatial and temporal heterogeneity in both temperature and power. Therefore, even minor thermal issues or anomalies can potentially start a chain of events that leads to an imbalance between the amount of heat generated by the computing nodes and the heat removed by the cooling system, resulting in thermal hazards. Although thermal anomalies are rare events, their timely detection and prediction is vital to avoid damage to IT and facility equipment and outages of the datacenter, with severe societal and business losses. For this reason, automated approaches to detect thermal anomalies in datacenters have considerable potential. This thesis analyzes and characterizes the power and thermal behaviour of a Tier0 datacenter (CINECA) during production and under abnormal thermal conditions. Then, a Deep Learning (DL)-powered thermal hazard prediction framework is proposed. The proposed models are validated against real thermal hazard events reported for the studied HPC cluster while in production. To the best of my knowledge, this thesis is the first empirical study of thermal anomaly detection and prediction techniques on a real large-scale HPC system.
For this thesis, I used a large-scale dataset: monitoring data from tens of thousands of sensors, collected over around 24 months at a sampling interval of roughly 20 seconds.
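To make the detection task concrete, a much simpler baseline than the thesis's DL framework can be sketched: flag a sensor reading as anomalous when it deviates strongly from its recent history. The stream below is hypothetical, and the rolling z-score rule is only an illustrative stand-in for the actual model:

```python
import statistics

# Hypothetical inlet-temperature stream (deg C) from one node, sampled every 20 s.
readings = [24.1, 24.3, 24.0, 24.2, 24.4, 24.1, 24.3, 29.8, 24.2, 24.0]

def detect_anomalies(series, window=5, k=4.0):
    """Flag readings deviating from the mean of the preceding window by
    more than k standard deviations (a simple statistical baseline;
    the thesis itself uses a Deep Learning model for this task)."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = statistics.mean(hist)
        sigma = statistics.stdev(hist)
        if abs(series[i] - mu) > k * sigma:
            anomalies.append(i)
    return anomalies

print(detect_anomalies(readings))  # flags index 7, the 29.8 deg C spike
```

A baseline of this kind is useful mainly as a point of comparison: it catches sharp spikes but misses the slow drifts and spatially correlated patterns that a learned model can exploit.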
Abstract:
Silicon-based discrete high-power devices need to be designed with optimal performance up to several thousand volts and amperes to reach power ratings ranging from a few kW to beyond the 1 GW mark. To this purpose, a key element is the improvement of the junction termination (JT), since it allows a drastic reduction of the surface electric field peaks which may lead to premature device failure. This thesis focuses mostly on the negative bevel termination, which has for several years constituted a standard processing step in bipolar production lines. A simple methodology to realize its counterpart, a planar JT with variation of the lateral doping concentration (VLD), is also described. On the JT, a thin layer of a semi-insulating material is usually deposited, which acts as a passivation layer, reducing the interface defects and contributing to increased device reliability. A thorough understanding of how the passivation layer properties affect the breakdown voltage and the leakage current of a fast-recovery diode is fundamental to preserving the ideal termination effect and providing a stable blocking capability. More recently, amorphous carbon, also called diamond-like carbon (DLC), has been used as a robust surface passivation material. By using a commercial TCAD tool, a detailed physical explanation of the electrostatic and transport properties of DLC has been provided. The proposed approach is able to predict the breakdown voltage and the leakage current of a negatively bevelled power diode passivated with DLC, as confirmed by the successful validation against the available experiments. In addition, the VLD JT, proposed to overcome the limitations of the negative bevel architecture, has been simulated, showing a breakdown voltage very close to the ideal one with much smaller area consumption. Finally, the effect of a low junction depth on the formation of current filaments has been analyzed by performing reverse-recovery simulations.
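The "ideal" breakdown voltage that the VLD termination approaches refers to the parallel-plane limit of the junction. As a rough point of reference (not a substitute for the TCAD simulations used in the thesis), the textbook empirical fit from Sze and Ng's "Physics of Semiconductor Devices" for an abrupt silicon junction can be evaluated directly; the doping value below is an assumed, illustrative one:

```python
# Textbook estimate of the ideal (parallel-plane) avalanche breakdown voltage
# of an abrupt p-n junction (Sze & Ng): V_B ~ 60 * (Eg/1.1)^1.5 * (N_B/1e16)^-0.75 V.
# First-order approximation only: it ignores the junction termination and the
# passivation layer, which are exactly what the thesis studies with TCAD.
def ideal_breakdown_voltage(doping_cm3, eg_ev=1.12):
    return 60.0 * (eg_ev / 1.1) ** 1.5 * (doping_cm3 / 1e16) ** -0.75

# Assumed lightly doped drift region, typical of a multi-kV silicon diode.
vb = ideal_breakdown_voltage(1e14)
print(f"ideal breakdown voltage ~ {vb:.0f} V")
```

The point of a well-designed JT is precisely to bring the real device's breakdown voltage close to this parallel-plane figure despite the field crowding at the edge of the junction.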
Abstract:
Model misspecification affects the classical test statistics used to assess the fit of Item Response Theory (IRT) models. Robust tests, such as the Generalized Lagrange Multiplier and Generalized Hausman tests, have been derived under model misspecification, but their use has not been widely explored in the IRT framework. In the first part of the thesis, we introduce the Generalized Lagrange Multiplier test to detect differential item functioning in IRT models for binary data under model misspecification. By means of a simulation study and a real data analysis, we compare its performance with the classical Lagrange Multiplier test, computed using the Hessian and the cross-product matrix, and the Generalized Jackknife Score test. The power of these tests is computed both empirically and asymptotically. The misspecifications considered are local dependence among items and a non-normal distribution of the latent variable. The results highlight that, under mild model misspecification, all tests perform well, while, under strong model misspecification, their performance deteriorates. None of the tests considered shows overall superior performance over the others. In the second part of the thesis, we extend the Generalized Hausman test to detect non-normality of the latent variable distribution. To build the test, we consider a semi-nonparametric IRT model, which assumes a more flexible latent variable distribution. By means of a simulation study and two real applications, we compare the performance of the Generalized Hausman test with the M2 limited-information goodness-of-fit test and the Likelihood-Ratio test. Additionally, the information criteria are computed. The Generalized Hausman test performs better than the Likelihood-Ratio test in terms of Type I error rates, and better than the M2 test in terms of power. The performance of the Generalized Hausman test and of the information criteria deteriorates when the sample size is small and the number of items is low.
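The classical Hausman statistic on which the generalized version builds contrasts two estimators of the same parameters: one efficient under the null model, one consistent under the suspected misspecification. A minimal sketch of that statistic follows; all numbers are hypothetical, and the thesis's Generalized Hausman test differs in how the variance of the difference is estimated (and may use a generalized inverse):

```python
import numpy as np

# Hypothetical estimates of two item parameters from two estimators:
theta_efficient = np.array([0.90, 1.45])  # e.g. under an assumed normal latent trait
theta_robust    = np.array([0.97, 1.30])  # e.g. under a more flexible distribution
V0 = np.array([[0.010, 0.002], [0.002, 0.015]])  # covariance, efficient estimator
V1 = np.array([[0.014, 0.003], [0.003, 0.022]])  # covariance, robust estimator

# Hausman statistic: H = d' (V1 - V0)^{-1} d, asymptotically chi-square
# with degrees of freedom equal to the number of parameters compared.
d = theta_robust - theta_efficient
H = d @ np.linalg.inv(V1 - V0) @ d
print(f"Hausman statistic H = {H:.3f} with {len(d)} degrees of freedom")
```

A large value of H relative to the chi-square reference distribution indicates that the two estimators diverge more than sampling error allows, i.e. evidence of misspecification such as a non-normal latent variable.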