4 results for Ressource Internet (Descripteur de forme)
at Universitätsbibliothek Kassel, Universität Kassel, Germany
Abstract:
The process of developing software that takes advantage of multiple processors is commonly referred to as parallel programming. For various reasons, this process is much harder than the sequential case. For decades, parallel programming was a problem for a small niche only: engineers parallelizing mostly numerical applications in High Performance Computing. This has changed with the advent of multi-core processors in mainstream computer architectures. Nowadays, parallel programming is a problem for a much larger group of developers. The main objective of this thesis was to find ways to make parallel programming easier for them. Three aims were identified in order to reach this objective: research the state of the art of parallel programming today, improve the education of software developers on the topic, and provide programmers with powerful abstractions to make their work easier. To reach these aims, several key steps were taken. To start with, a survey was conducted among parallel programmers to find out about the state of the art. More than 250 people participated, yielding results about the parallel programming systems and languages in use, as well as about common problems with these systems. Furthermore, a study was conducted in university classes on parallel programming. It resulted in a list of frequently made mistakes that were analyzed and used to create a programmers' checklist to help avoid them in the future. For programmers' education, an online resource was set up to collect experiences and knowledge in the field of parallel programming, called the Parawiki. Another key step in this direction was the creation of the Thinking Parallel weblog, where more than 50,000 readers to date have read essays on the topic. For the third aim (powerful abstractions), it was decided to concentrate on one parallel programming system: OpenMP. Its ease of use and high level of abstraction were the most important reasons for this decision. Two different research directions were pursued. The first one resulted in a parallel library called AthenaMP. It contains so-called generic components, derived from design patterns for parallel programming. These include functionality to enhance the locks provided by OpenMP, to perform operations on large amounts of data (data-parallel programming), and to enable the implementation of irregular algorithms using task pools. AthenaMP itself serves a triple role: the components are well documented and can be used directly in programs, developers can study the source code and learn from it, and compiler writers can use it as a testing ground for their OpenMP compilers. The second research direction was targeted at changing the OpenMP specification to make the system more powerful. The main contributions here were a proposal to enable thread cancellation and a proposal to avoid busy waiting. Both were implemented in a research compiler, shown to be useful in example applications, and proposed to the OpenMP Language Committee.
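The abstract itself contains no code, but the kind of "enhanced lock" generic component it mentions can be sketched roughly as follows. This is a minimal illustration only, not the actual AthenaMP API: the class name ScopedOmpLock and the surrounding program are hypothetical, showing an RAII-style guard around OpenMP's plain omp_lock_t used inside a simple data-parallel loop (compile with -fopenmp).

```cpp
// Illustrative sketch only; names are hypothetical and not taken from AthenaMP.
#include <omp.h>
#include <vector>
#include <cstdio>

class ScopedOmpLock {                      // RAII wrapper around omp_lock_t
public:
    explicit ScopedOmpLock(omp_lock_t& lock) : lock_(lock) {
        omp_set_lock(&lock_);              // acquire on construction
    }
    ~ScopedOmpLock() {
        omp_unset_lock(&lock_);            // release automatically on scope exit
    }
    ScopedOmpLock(const ScopedOmpLock&) = delete;
    ScopedOmpLock& operator=(const ScopedOmpLock&) = delete;
private:
    omp_lock_t& lock_;
};

int main() {
    omp_lock_t lock;
    omp_init_lock(&lock);

    std::vector<int> shared_log;

    // A simple data-parallel loop; the guard protects the shared container.
    #pragma omp parallel for
    for (int i = 0; i < 100; ++i) {
        int value = i * i;                 // independent, data-parallel work
        ScopedOmpLock guard(lock);         // scoped critical section
        shared_log.push_back(value);
    }

    omp_destroy_lock(&lock);
    std::printf("collected %zu results\n", shared_log.size());
    return 0;
}
```

Wrapping the lock in a scoped guard ensures it is released even if the guarded code returns early or throws, which is one of the conveniences such generic components add on top of the raw OpenMP locking routines.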
Abstract:
National Socialism, and with it the Holocaust, is considered the most thoroughly researched period of German history. Countless reports and documents attest to the genocide of the European Jews and thus allow a precise and detailed picture of the events. Despite this very good source situation, Holocaust deniers claim that the Shoah was staged or that the estimated numbers of victims must be rejected as gross exaggeration. The present study examines how Holocaust deniers argue and which manipulation techniques they use to falsify historical facts. The focus is on propagandistic texts on the Internet, the medium currently used as the most common distribution channel for Holocaust-denying propaganda. In order to highlight current tendencies and to work out breaks and continuities, recent Internet publications are compared with print media from the 1970s and 1980s. The analysis makes clear that Holocaust-denying argumentation patterns have changed with the "digital revolution" and that the protagonists of the scene are adjusting to new target groups. While early print media were published primarily for a limited circle of readers with a specific interest, Holocaust deniers today have recognized the entirety of Internet users as their target audience. Against this background, the tactics of concealment and deception are changing, as is the overall posture of the texts. Whereas the authors argued offensively and radically in earlier publications, they currently concentrate on more moderate argumentation patterns that aim to trivialize and minimize the Shoah. Such forms of propaganda are more compatible with the political mainstream because they are less conspiratorial in design and can better conceal their antisemitic motive. Radical Holocaust denial, which claims that the entire body of scholarly knowledge about the Shoah is a fabrication, is found less frequently on the Internet. More often, a "pinprick tactic" is pursued that picks up individual details, calls them into question, or purports to refute them. These attacks are not meant to have their effect in isolation but, taken together, to suggest that the factual basis of the Holocaust is open to question.
Abstract:
Despite its young history, Computer Science Education has seen a number of "revolutions". As a veteran in the field, the author reflects on the many changes he has seen in computing and its teaching. The intent of this personal collection is to point out that most revolutions came unforeseen and that many of the new learning initiatives, despite high financial input, ultimately failed. The author then considers the current revolution (MOOCs, inverted lectures, peer instruction, game design) and, based on the lessons learned earlier, argues why video recording is so successful. Given that this is the decade in which we lost print (papers, printed books, book shops, libraries), the author then conjectures that the impact of the Internet will make this revolution different from previous ones in that most of the changes are irreversible. As a consequence, he warns against storming ahead blindly and suggests conserving, while it is still possible, valuable components of what might soon be called the antebellum age of education.