794 results for Distributed Representations
Abstract:
High Performance Computing (HPC) is a technology used by computational clusters to build processing systems capable of delivering far more powerful services than traditional computers. As a consequence, HPC has become a decisive factor in industrial competition and in research. HPC systems keep growing in the number of nodes and cores, and forecasts indicate that the node count will soon reach one million. This kind of architecture also entails very high resource-consumption costs, which become unsustainable for the industrial market. A centralized scheduler cannot manage such a large number of resources while maintaining a reasonable response time. This thesis presents a distributed scheduling model based on constraint programming, which expresses the scheduling problem as a set of temporal and resource constraints that must be satisfied. The scheduler seeks to optimize resource performance and to approach a desired consumption profile that is regarded as optimal. Several different models are analyzed, and each of them is tested in various environments.
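As a rough illustration of the constraint-programming formulation described above, the following sketch encodes a toy job set with a cumulative resource constraint and one temporal (precedence) constraint using Google OR-Tools CP-SAT. The job data, node capacity, precedence relation, and makespan objective are assumptions made for illustration; the thesis's model is distributed and additionally steers the schedule toward a desired consumption profile.

# Minimal sketch of constraint-based scheduling with OR-Tools CP-SAT.
# Job data, node capacity, and the objective are illustrative assumptions.
from ortools.sat.python import cp_model

jobs = [(4, 8), (2, 16), (6, 4), (3, 8)]   # (duration, cores requested)
node_cores = 24                            # capacity of one node (assumed)
horizon = sum(d for d, _ in jobs)

model = cp_model.CpModel()
starts, intervals, demands = [], [], []
for i, (dur, cores) in enumerate(jobs):
    s = model.NewIntVar(0, horizon, f"start_{i}")
    e = model.NewIntVar(0, horizon, f"end_{i}")
    intervals.append(model.NewIntervalVar(s, dur, e, f"job_{i}"))
    starts.append(s)
    demands.append(cores)

# Resource constraint: concurrently running jobs may not exceed the node's cores.
model.AddCumulative(intervals, demands, node_cores)

# Temporal constraint example: job 1 may only start after job 0 has finished.
model.Add(starts[1] >= starts[0] + jobs[0][0])

# Objective: minimize the makespan (a stand-in for the consumption-profile objective).
makespan = model.NewIntVar(0, horizon, "makespan")
for i, (dur, _) in enumerate(jobs):
    model.Add(makespan >= starts[i] + dur)
model.Minimize(makespan)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for i, s in enumerate(starts):
        print(f"job {i} starts at t={solver.Value(s)}")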
Abstract:
Simulation is defined as the representation of the behaviour of a system or process by means of the operation of another or, alternatively, following the etymology of the verb "to simulate", as the reproduction of something fictitious and unreal as if it were real. Simulation allows us to model reality, explore different solutions, evaluate systems that cannot be built for various reasons, and carry out different evaluations that remain dynamic with respect to changing conditions. Simulation models can reach an extremely high degree of expressiveness, and a single computer can hardly deliver the expected results within acceptable times. One possible solution, given current technological trends, is to increase computational capacity through a distributed architecture (exploiting, for example, the possibilities offered by cloud computing). This thesis focuses on this area and relates it to another topic that is gaining more and more relevance day by day: online anonymity. Recent news events have shown how a public and intrinsically insecure network such as today's Internet is ill-suited to preserving the confidentiality, integrity and, in some cases, availability of the assets we use: when distributing computational resources that interact with one another, we cannot ignore the many concrete risks; in some sensitive simulation contexts (e.g., military simulation, scientific research, etc.) we cannot afford the uncontrolled disclosure of our data or, even worse, an attack on the availability of the resources involved. Being anonymous has an extremely relevant implication: being less attackable, because not identifiable.
Abstract:
Compliant mechanisms with evenly distributed stresses have better load-bearing ability and larger range of motion than mechanisms with compliance and stresses lumped at flexural hinges. In this paper, we present a metric to quantify how uniformly the strain energy of deformation and thus the stresses are distributed throughout the mechanism topology. The resulting metric is used to optimize cross-sections of conceptual compliant topologies leading to designs with maximal stress distribution. This optimization framework is demonstrated for both single-port mechanisms and single-input single-output mechanisms. It is observed that the optimized designs have lower stresses than their nonoptimized counterparts, which implies an ability for single-port mechanisms to store larger strain energy, and single-input single-output mechanisms to perform larger output work before failure.
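As a concrete but hypothetical illustration of what a stress-distribution metric can look like, the sketch below scores per-element strain energy densities by the ratio of their mean to their maximum, which equals 1 for a perfectly even distribution and approaches 0 when energy is lumped in a few elements; the exact metric proposed in the paper may differ.

# Hedged sketch of one plausible uniformity measure (not necessarily the
# paper's metric): mean-to-max ratio of element strain energy densities.
import numpy as np

def uniformity(strain_energy, volume):
    """strain_energy, volume: per-element arrays from a finite element analysis."""
    density = np.asarray(strain_energy, dtype=float) / np.asarray(volume, dtype=float)
    return density.mean() / density.max()

# Hypothetical numbers: a lumped-hinge design vs. a distributed-compliance design.
print(uniformity([9.0, 0.2, 0.2, 0.2], [1.0, 1.0, 1.0, 1.0]))  # ~0.27 (lumped)
print(uniformity([2.4, 2.3, 2.5, 2.4], [1.0, 1.0, 1.0, 1.0]))  # ~0.96 (even)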
Abstract:
This thesis explores system performance for reconfigurable distributed systems and provides an analytical model for determining the throughput of theoretical systems based on the OpenSPARC FPGA Board and the SIRC Communication Framework. The model was developed by studying a small set of variables that together determine a system's throughput. Its importance lies in helping system designers decide whether or not to commit to designing a reconfigurable distributed system, based on estimated performance and hardware costs. Because custom hardware design and distributed system design are both time consuming and costly, it is important for designers to assess system feasibility early in the development cycle. Based on experimental data, the model presented in this thesis shows a close fit, with less than 10% experimental error on average. The model is limited to a certain range of problems, but it can still be used within those limitations and also provides a foundation for further work on modeling reconfigurable distributed systems.
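The thesis's variables and model are not reproduced in the abstract, so as a generic illustration only, the sketch below bounds throughput by the slower of communication and FPGA computation in a pipelined distributed system; all names and numbers are assumptions.

# Generic throughput bound for a pipelined distributed system (illustrative
# assumption, not the thesis's model): the slower of link and compute limits.
def throughput_jobs_per_s(n_boards, bytes_per_job, link_bytes_per_s,
                          compute_s_per_job, overhead_s_per_job=0.0):
    comm_rate = link_bytes_per_s / bytes_per_job               # jobs/s the link can feed
    compute_rate = n_boards / (compute_s_per_job + overhead_s_per_job)
    return min(comm_rate, compute_rate)

# Hypothetical numbers: 4 boards, 1 MiB per job, 1 Gb/s Ethernet, 10 ms per job.
print(throughput_jobs_per_s(4, 2**20, 125e6, 0.010))           # ~119 jobs/s, link-bound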
Abstract:
This thesis presents two frameworks, a software framework and a hardware core manager framework, which together can be used to develop a processing platform on a distributed system of field-programmable gate array (FPGA) boards. The software framework provides users with the ability to easily develop applications that exploit the processing power of FPGAs, while the hardware core manager framework gives users the ability to configure and interact with multiple FPGA boards and/or hardware cores. The thesis describes the design and development of these frameworks and analyzes the performance of a system that was constructed using them. The performance analysis included measuring the effect of incorporating additional hardware components into the system and comparing the system to a software-only implementation. This work draws conclusions from the results of the performance analysis and offers suggestions for future work.
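As an entirely hypothetical sketch of the layered idea described above (none of these names or interfaces come from the thesis), an application might submit jobs through a software layer while a core manager hands out whichever board or hardware core is free:

# Hypothetical sketch only: a core manager that leases boards/cores to jobs.
from concurrent.futures import ThreadPoolExecutor
from queue import Queue

class CoreManager:
    """Tracks free hardware cores and leases them to jobs one at a time."""
    def __init__(self, board_ids):
        self.free = Queue()
        for b in board_ids:
            self.free.put(b)

    def run_on_core(self, job_data):
        board = self.free.get()                    # block until a core is free
        try:
            return f"board{board}:{sum(job_data)}" # stand-in for offloaded FPGA work
        finally:
            self.free.put(board)                   # release the core

manager = CoreManager(board_ids=[0, 1, 2])
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(manager.run_on_core, [[1, 2], [3, 4], [5, 6]]))
print(results)                                     # results keep the submission order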
Abstract:
For as far back as human history can be traced, mankind has questioned what it means to be human. One of the most common approaches in Western culture's intellectual tradition to answering this question has been to compare humans with or against other animals. I argue that it was not until Charles Darwin's publication of The Descent of Man and Selection in Relation to Sex (1871) that Western culture was forced to seriously consider human identity in relation to the human/nonhuman primate line. Since no thinker prior to Charles Darwin had caused such an identity crisis in Western thought, this interdisciplinary analysis of the history of how the human/nonhuman primate line has been understood focuses on the reciprocal relationship between popular culture and scientific representations from 1871 to the Human Genome Consortium in 2000. Focusing on the concept coined as the "Darwin-Müller debate," representations of the human/nonhuman primate line are traced through themes of language, intelligence, and claims of variation across the popular texts Descent of Man, The Jungle Books (1894), Tarzan of the Apes (1914), and Planet of the Apes (1963). Additional themes, such as the nature versus nurture debate and other phenotypic attributes commonly used for comparison between man and apes, are also analyzed. These popular culture representations are compared with related or influential scientific research from the respective time period of each text to shed light on the reciprocal nature of Western intellectual tradition, popular notions of the human/nonhuman primate line, and the development of the field of primatology. Ultimately this thesis shows that the Darwin-Müller debate is indeterminable, and such a lack of resolution makes man uncomfortable. Man's unsettled response and desire for self-knowledge further fuel a continued search for answers to human identity. As the Human Genome Project has given rise to new debates, and primate research has become less anthropocentric over time, the mysteries of man's future have become more concerning than the questions of our past. The human/nonhuman primate line is reduced to a 1% difference, and new debates have begun to overshadow the Darwin-Müller debate. In conclusion, I argue that human identity is best represented through the metaphor of evolution: both have an unknown beginning, both have an indeterminable future with no definite end, and, like a species under the influence of evolution, what it means to be human is a constant, indeterminable process of change.
Abstract:
Cezary Trutkowski (Poland). Social Representations of Politics. Mr. Trutkowski is an assistant in the Institute for Social Research of the University of Warsaw and worked on this research from August 1997 to July 1999. Research into social representations requires the use of various methods to show the phenomenon from different angles. In contrast with the contemporary trend towards the measurement of attitudes in opinion surveys, Trutkowski examined social representations of politics in order to study the problem of political participation. He thus aimed to study a problem which he sees as social in nature from a genuinely sociological perspective. His research revealed a distinct difference between the social representations of politics shared by politicians and by the general public. The former consider politics in general as a struggle for power that can be used to pursue their own vision of social order. They perceive political activity as a kind of homage to be paid to the greater idea that they pursue. At the same time, politicians admitted that politics is very often treated by political actors as a means of gaining personal profit, becoming purely a fight for the power to rule. According to voters, politicians should be treated as employees on a contract whose duration depends on their performance. Politics is therefore not a homage to an idea but a service to citizens. They did, however, admit that this is wishful thinking in Poland today. A content analysis of electoral campaigns confirmed the dark side of the representation of politics: politicians spend the campaign making promises and presenting visions on television, while quarrelling and fighting in the everyday activities reported by the press. Trutkowski sees his research as a contribution to the foundations of a new approach to studying mass phenomena. By reviving some forgotten ideas of Durkheim and the Chicago School and combining the theory of social representations with a new methodological programme, he believes that it is possible to build a more social social psychology and sociology.
Transient rhythmic network activity in the somatosensory cortex evoked by distributed input in vitro
Abstract:
The initiation and maintenance of physiological and pathophysiological oscillatory activity depend on the synaptic interactions within neuronal networks. We studied the mechanisms underlying evoked transient network oscillation in acute slices of the adolescent rat somatosensory cortex and modeled its underpinning mechanisms. Oscillations were evoked by brief, spatially distributed noisy extracellular stimulation delivered via bipolar electrodes. Evoked transient network oscillation was detected with multi-neuron patch-clamp recordings under different pharmacological conditions. The observed oscillations lie in the frequency range of 2-5 Hz and consist of compound synaptic events 4-12 mV in amplitude and 40-150 ms in width, with rare overlying action potentials. This evoked transient network oscillation is only weakly expressed in the somatosensory cortex and requires [K+]o increased to 6.25 mM and [Ca2+]o and [Mg2+]o decreased to 1.5 mM and 0.5 mM, respectively. A peak in the cross-correlation among the membrane potentials of layer II/III, IV and V neurons reflects the network-driven basis of the evoked transient network oscillation. The initiation of the evoked transient network oscillation is accompanied by an increase in [K+]o and can be prevented by the K+ channel blocker quinidine. In addition, a shift of the chloride reversal potential takes place during stimulation, resulting in a depolarizing GABA(A) receptor response. Blockade of alpha-amino-3-hydroxy-5-methyl-4-isoxazole-propionate (AMPA), N-methyl-D-aspartate (NMDA), or GABA(A) receptors, as well as gap junctions, prevents evoked transient network oscillation, while a reduction of AMPA or GABA(A) receptor desensitization increases its duration and amplitude. The apparent reversal potential of -27 mV of the evoked transient network oscillation, its pharmacological profile, and the modeling results suggest a mixed contribution of glutamatergic, excitatory GABAergic, and gap junctional conductances to the initiation and maintenance of this oscillatory activity. With these properties, evoked transient network oscillation resembles epileptic afterdischarges more than any other form of physiological or pathophysiological neocortical oscillatory activity.
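As a worked illustration of how a depolarized chloride reversal can pull the compound event's apparent reversal well above the GABA(A) reversal, the conductance-weighted reversal E_rev = sum(g_i * E_i) / sum(g_i) is computed below; all conductance values and the gap-junction "reversal" (the potential of coupled neighbours) are assumed for illustration and are not measurements from the study.

# Worked example (assumed values, not the authors' measurements): mixed
# apparent reversal potential from glutamatergic, depolarizing GABAergic,
# and gap junctional conductances, E_rev = sum(g_i * E_i) / sum(g_i).
conductances_nS = {"AMPA/NMDA": 3.0, "GABA_A": 6.0, "gap": 1.0}
reversals_mV = {"AMPA/NMDA": 0.0, "GABA_A": -40.0, "gap": -45.0}

e_rev = (sum(conductances_nS[k] * reversals_mV[k] for k in conductances_nS)
         / sum(conductances_nS.values()))
print(f"apparent reversal = {e_rev:.1f} mV")   # -28.5 mV, near the reported -27 mV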