17 results for Chain of value


Relevance: 90.00%

Abstract:

Modern scientific discoveries are driven by an insatiable demand for computational resources. High-Performance Computing (HPC) systems aggregate computing power to deliver considerably higher performance than a typical desktop computer can provide, in order to solve large problems in science, engineering, or business. An HPC room in a datacenter is a complex controlled environment that hosts thousands of computing nodes consuming electrical power in the range of megawatts, all of which is ultimately transformed into heat. Although a datacenter contains sophisticated cooling systems, our studies provide quantitative evidence of thermal bottlenecks in real-life production workloads, showing significant spatial and temporal heterogeneity in both temperature and power. Minor thermal issues or anomalies can therefore start a chain of events leading to an imbalance between the heat generated by the computing nodes and the heat removed by the cooling system, creating thermal hazards. Although thermal anomalies are rare events, detecting or predicting them in time is vital to avoid damage to IT and facility equipment and outages of the datacenter, with severe societal and business losses. For this reason, automated approaches to detecting thermal anomalies in datacenters have considerable potential. This thesis analyzes and characterizes the power and thermal behavior of a Tier-0 datacenter (CINECA) during production and under abnormal thermal conditions. A Deep Learning (DL)-powered thermal hazard prediction framework is then proposed. The proposed models are validated against real thermal hazard events reported for the studied HPC cluster while in production. To the best of my knowledge, this thesis is the first empirical study of thermal anomaly detection and prediction techniques on a real large-scale HPC system.
For this thesis, I used a large-scale dataset: monitoring data from tens of thousands of sensors, collected for around 24 months at a sampling interval of around 20 seconds.
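The abstract does not specify the DL models used, so no attempt is made to reproduce them here. As a hedged illustration of the general idea only — flagging sensor readings that deviate from recent thermal behavior — a minimal statistical sketch (the window size, threshold, and sample data are assumptions, not values from the thesis):

```python
import statistics

def detect_thermal_anomalies(readings, window=12, threshold=3.0):
    """Flag readings that deviate strongly from a trailing window.

    readings: sequence of temperature samples (e.g. one per ~20 s).
    Returns indices whose z-score against the trailing window
    exceeds the threshold.
    """
    anomalies = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mean = statistics.fmean(hist)
        stdev = statistics.pstdev(hist) or 1e-9  # avoid division by zero
        if abs(readings[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# Synthetic example: steady readings around 35 degrees C
# with one sudden spike at index 20.
temps = [35.0 + 0.1 * (i % 3) for i in range(30)]
temps[20] = 48.0
print(detect_thermal_anomalies(temps))  # -> [20]
```

A DL approach such as the one the thesis proposes would replace this hand-set threshold with a model learned from the historical sensor data, but the detection goal is the same.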

Relevance: 90.00%

Abstract:

The notion of commodification is a fascinating one. It entails many facets, ranging from subjective debates on the desirability of commodification to in-depth economic analyses of objects of value and their corresponding markets. Commodity theory is therefore not defined by a single debate but spans a plethora of different discussions. This thesis maps and situates those theories and debates and selects one specific strain to investigate further. It argues that commodity theory in its optima forma deals with the investigation of what sets commodities apart from non-commodities. It proceeds to examine the many answers given to this question by scholars from the mid-1800s to the late 2000s. Ultimately, commodification is defined as a process in which an object becomes an element of the total wealth of societies in which the capitalist mode of production prevails. In doing so, objects must meet observables, or indicia, of commodification provided by commodity theories. Problems arise when objects are clearly part of the total wealth of a society without meeting established commodity indicia. In such cases, objects are part of the total wealth of a society without counting as commodities. This thesis examines this phenomenon in relation to the novel commodities of audiences and data. It explains how these non-commodities (according to classical theories) are still essential elements of industry. The thesis then takes a deep dive into commodity theory using John Searle's theory of the construction of social reality.