503 results for algorithmic skeletons
Abstract:
In this paper we advocate the Loop-of-stencil-reduce pattern as a way to simplify the parallel programming of heterogeneous platforms (multicore + GPUs). Loop-of-stencil-reduce is general enough to subsume map, reduce, map-reduce, stencil, stencil-reduce and, crucially, their use in a loop. It transparently targets combinations of CPU cores and GPUs via OpenCL, and it simplifies the deployment of a single stencil computation kernel on different GPUs. The paper discusses the implementation of Loop-of-stencil-reduce within the FastFlow parallel framework, using a simple iterative data-parallel application (Game of Life) as a running example and a highly effective parallel filter for visual data restoration to assess performance. Thanks to the high-level design of Loop-of-stencil-reduce, the filter ran seamlessly on a multicore machine, on multiple GPUs, and on both.
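As an illustration of the pattern described in this abstract, the following is a minimal, framework-agnostic sketch in Python/NumPy (it is not FastFlow's OpenCL-based implementation, and all function names are hypothetical) of a loop-of-stencil-reduce applied to Game of Life: each iteration runs a stencil over the grid and then a global reduce whose result drives the loop condition.

    # Minimal sketch of the loop-of-stencil-reduce structure, assuming NumPy.
    # This is NOT the FastFlow API; it only shows repeat { stencil; reduce }.
    import numpy as np

    def stencil_step(grid):
        """Stencil: each cell inspects its 8 neighbours (toroidal borders)."""
        neighbours = sum(
            np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0)
        )
        # Conway's rules: birth with exactly 3 neighbours, survival with 2 or 3.
        return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(np.uint8)

    def loop_of_stencil_reduce(grid, max_iters=1000):
        for it in range(max_iters):
            new_grid = stencil_step(grid)            # stencil (map) phase
            changed = int(np.sum(new_grid != grid))  # reduce phase: global sum
            grid = new_grid
            if changed == 0:                         # loop condition uses the reduced value
                break
        return grid, it

    rng = np.random.default_rng(0)
    world = (rng.random((64, 64)) < 0.3).astype(np.uint8)
    final, iters = loop_of_stencil_reduce(world)
    print("stopped after", iters + 1, "iterations")

In a heterogeneous setting the stencil and reduce phases are the parts offloaded to CPU cores and GPUs; the sketch only makes the loop structure concrete.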
Abstract:
Completed under joint supervision (cotutelle) with the École normale supérieure de Cachan – Université Paris-Saclay.
Abstract:
Rising anthropogenic CO2 in the atmosphere is accompanied by an increase in oceanic CO2 and a concomitant decline in seawater pH [1]. This phenomenon, known as ocean acidification (OA), has been experimentally shown to impact the biology and ecology of numerous animals and plants [2], most notably those that precipitate calcium carbonate skeletons, such as reef-building corals [3]. Volcanically acidified water at Maug, Commonwealth of the Northern Mariana Islands (CNMI) is equivalent to near-future predictions for what coral reef ecosystems will experience worldwide due to OA. We provide the first chemical and ecological assessment of this unique site and show that acidification-related stress significantly influences the abundance and diversity of coral reef taxa, leading to the often-predicted shift from a coral to an algae-dominated state [4, 5]. This study provides field evidence that acidification can lead to macroalgae dominance on reefs.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
A history of specialties in economics since the late 1950s is constructed on the basis of a large corpus of documents from economics journals. The production of this history relies on a combination of algorithmic methods that avoid subjective assessments of the boundaries of specialties: bibliographic coupling, automated community detection in dynamic networks, and text mining. These methods uncover a structuring of economics around recognizable specialties, with some significant changes over the period covered (1956–2014). Among our results, especially noteworthy are (1) the clear-cut existence of ten families of specialties, (2) the disappearance in the late 1970s of a specialty focused on general economic theory, (3) the dispersal of the econometrics-centered specialty in the early 1990s and the ensuing importance of specific econometric methods for the identity of many specialties since the 1990s, and (4) the low level of specialization of individual economists throughout the period, in contrast to physicists, who were already highly specialized by the late 1960s.
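The abstract above names the core bibliometric techniques. As a hedged, simplified illustration (not the authors' code; the networkx library and the toy corpus are assumptions), bibliographic coupling links two articles by the number of references they share, and community detection on the resulting weighted network yields candidate specialties:

    from itertools import combinations
    import networkx as nx
    from networkx.algorithms import community

    def bibliographic_coupling(references):
        """references: dict mapping an article id to the set of works it cites."""
        g = nx.Graph()
        g.add_nodes_from(references)
        for a, b in combinations(references, 2):
            shared = len(references[a] & references[b])   # coupling strength
            if shared > 0:
                g.add_edge(a, b, weight=shared)
        return g

    # Toy corpus: A and B share two references, so they are strongly coupled.
    corpus = {"A": {"r1", "r2", "r3"}, "B": {"r2", "r3"}, "C": {"r3", "r4"}}
    g = bibliographic_coupling(corpus)
    specialties = community.greedy_modularity_communities(g, weight="weight")
    print(list(g.edges(data=True)), [sorted(c) for c in specialties])

The dynamic-network and text-mining steps of the study are not sketched here.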
Abstract:
INTRODUCTION Although clinician-teachers readily detect learners' difficulties, they are often poorly equipped for the subsequent steps, from diagnosis to remediation. While tools have been developed to guide them when facing their learners' clinical reasoning difficulties, these tools may be less familiar to clinicians and less suited to contexts of episodic supervision and acute care such as the emergency department. We therefore developed an algorithmic application, based on the taxonomy of Audétat et al. (2010), to guide clinician-teachers just-in-time when facing clinical reasoning difficulties. METHODOLOGY An interpretive descriptive study was carried out to assess the usefulness, acceptability and feasibility of using this application in the emergency department. Semi-structured interviews were conducted with a convenience sample of twelve emergency physicians, before and after a three-month trial period with the tool. RESULTS The application was perceived as particularly useful for pinpointing learners' clinical reasoning difficulties. Using the tool was considered acceptable and feasible in the emergency setting, in particular thanks to the mobile format. DISCUSSION These results suggest that the tool can be considered useful not only for facilitating the identification of learners' difficulties but also for providing accessible faculty support. The mobile, algorithmic format appears to have been a facilitating factor, as clinicians already use this format to look up information on the spot when solving clinical problems. CONCLUSION Overall, the study demonstrated good usefulness, acceptability and feasibility of the tool in a context of episodic supervision in acute care, supporting its use by clinician-teachers in this setting. The study also corroborates the value of a mobile, algorithmic format for promoting knowledge transfer in medical education.
Abstract:
The present work involved the production of composite membranes for CO2 separation at high temperatures. The usual composites consist of two phases: a ceramic phase of gadolinium-doped ceria (Ce0.9Gd0.1O1.95, CGO), an oxide-ion conductor that serves as the support for a second phase composed of a eutectic mixture of alkali carbonates (Li2CO3 and Na2CO3), which ensures carbonate-ion transport. The aim of the work is to study ion transport through these composites in order to understand whether the salts in these composites exhibit single-ion or mixed conduction. To answer this question, faradaic efficiency measurements were performed on composite samples with matrices of CGO (an oxide-ion conductor) and of lithium aluminate (not an oxide-ion conductor). Both the porous skeletons and the composites were prepared using methods and precursors similar to those reported in the literature. The porous skeletons were processed first and subsequently impregnated with the eutectic carbonate mixture. The resulting composites were characterised by impedance spectroscopy and scanning electron microscopy before being submitted to the faradaic efficiency tests. The faradaic efficiency results revealed that mixed conduction processes do in fact occur, and that their importance depends on the operating conditions of the membrane.
Abstract:
Dendritic cells are antigen presenting cells that provide a vital link between the innate and adaptive immune system. Research into this family of cells has revealed that they perform the role of coordinating T-cell based immune responses, both reactive and for generating tolerance. We have derived an algorithm based on the functionality of these cells, and have used the signals and differentiation pathways to build a control mechanism for an artificial immune system. We present our algorithmic details in addition to some preliminary results, where the algorithm was applied for the purpose of anomaly detection. We hope that this algorithm will eventually become the key component within a large, distributed immune system, based on sound immunological concepts.
Abstract:
This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely-believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: An approximation algorithm for an NP-Hard problem is a polynomial-time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible.

The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex-connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges and nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below.

We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k) approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes. We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications.

Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink.
We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths. We further investigate capacitated network design, where edges may have arbitrary costs and capacities. Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each uv, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results. In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
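As a hedged illustration of the connectivity notions defined in this abstract (not code from the thesis; the networkx library is an assumed dependency), two vertices u and v are k-edge-connected exactly when k edge-disjoint u-v paths exist, so a unit-capacity maximum flow computes the largest such k:

    import networkx as nx

    def local_edge_connectivity(g, u, v):
        """Largest k such that deleting any k-1 edges leaves u and v connected
        (equivalently, the number of edge-disjoint u-v paths, by Menger's theorem)."""
        h = nx.DiGraph()
        for a, b in g.edges():             # replace each undirected edge by
            h.add_edge(a, b, capacity=1)   # two unit-capacity arcs
            h.add_edge(b, a, capacity=1)
        flow_value, _ = nx.maximum_flow(h, u, v)
        return flow_value

    cycle = nx.cycle_graph(4)              # in a 4-cycle every pair is 2-edge-connected
    print(local_edge_connectivity(cycle, 0, 2))   # -> 2

The thesis targets the much harder optimization versions of these questions (minimum-cost subgraphs meeting such connectivity requirements); this snippet only makes the definition concrete.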
Abstract:
Dendritic cells are antigen presenting cells that provide a vital link between the innate and adaptive immune system and perform the initial detection of pathogenic invaders. Research into this family of cells has revealed that they perform information fusion which directs immune responses. We have derived a Dendritic Cell Algorithm based on the functionality of these cells, modelling the biological signals and differentiation pathways to build a control mechanism for an artificial immune system. We present algorithmic details together with experimental results from applying the algorithm to the detection of port scans as an anomaly detection task. The results show that the Dendritic Cell Algorithm successfully detects port scans.
Abstract:
Dendritic cells are antigen presenting cells that provide a vital link between the innate and adaptive immune system. Research into this family of cells has revealed that they perform the role of coordinating T-cell based immune responses, both reactive and for generating tolerance. We have derived an algorithm based on the functionality of these cells, and have used the signals and differentiation pathways to build a control mechanism for an artificial immune system. We present our algorithmic details in addition to some preliminary results, where the algorithm was applied for the purpose of anomaly detection. We hope that this algorithm will eventually become the key component within a large, distributed immune system, based on sound immunological concepts.
Abstract:
Artificial immune systems have previously been applied to the problem of intrusion detection. The aim of this research is to develop an intrusion detection system based on the function of Dendritic Cells (DCs). DCs are antigen presenting cells and key to the activation of the human immune system, behaviour which has been abstracted to form the Dendritic Cell Algorithm (DCA). In algorithmic terms, individual DCs perform multi-sensor data fusion, asynchronously correlating the fused data signals with a secondary data stream. The aggregate output of a population of cells is analysed and forms the basis of an anomaly detection system. In this paper the DCA is applied to the detection of outgoing port scans using TCP SYN packets. Results show that detection can be achieved with the DCA, yet some false positives can be encountered when scanning and using other network services simultaneously. Suggestions are made for using adaptive signals to alleviate the problem uncovered here.
Abstract:
Artificial immune systems, more specifically the negative selection algorithm, have previously been applied to intrusion detection. The aim of this research is to develop an intrusion detection system based on a novel concept in immunology, the Danger Theory. Dendritic Cells (DCs) are antigen presenting cells and key to the activation of the human immune system. DCs perform the vital role of combining signals from the host tissue and correlating these signals with proteins known as antigens. In algorithmic terms, individual DCs perform multi-sensor data fusion based on time windows. The whole population of DCs asynchronously correlates the fused signals with a secondary data stream. The behaviour of human DCs is abstracted to form the DC Algorithm (DCA), which is implemented using an immune-inspired framework, libtissue. This system is used to detect context switching on a basic machine learning dataset and to detect outgoing port scans in real time. Experimental results show a significant difference between an outgoing port scan and normal traffic.
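As a hedged, much-simplified sketch of the signal-fusion idea common to the DCA abstracts in this listing (this is not the libtissue implementation; the window size, signal weights and maturity rule are illustrative assumptions), each simulated cell fuses the danger and safe signals seen in its time window, and an antigen's anomaly score is the fraction of cells that presented it in a danger-dominated ("mature") context:

    from collections import defaultdict

    def dca_scores(stream, window=3, danger_w=1.0, safe_w=2.0):
        """stream: list of (antigen_id, danger_signal, safe_signal) observations."""
        presentations = defaultdict(lambda: [0, 0])   # antigen -> [mature, total]
        for start in range(0, len(stream), window):   # one cell per time window
            cell = stream[start:start + window]
            danger = danger_w * sum(d for _, d, _ in cell)
            safe = safe_w * sum(s for _, _, s in cell)
            mature = danger > safe                     # fused context of this cell
            for antigen, _, _ in cell:
                presentations[antigen][0] += int(mature)
                presentations[antigen][1] += 1
        return {a: m / t for a, (m, t) in presentations.items()}

    # Toy run: "scan" co-occurs with danger signals, "ssh" with safe ones.
    observations = [("scan", 0.9, 0.1), ("scan", 0.8, 0.0), ("ssh", 0.1, 0.9),
                    ("ssh", 0.0, 0.8), ("scan", 0.7, 0.2), ("ssh", 0.1, 0.9)]
    print(dca_scores(observations, window=2))   # "scan" scores higher than "ssh"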
Abstract:
Dendritic cells are antigen presenting cells that provide a vital link between the innate and adaptive immune system. Research into this family of cells has revealed that they perform the role of coordinating T-cell based immune responses, both reactive and for generating tolerance. We have derived an algorithm based on the functionality of these cells, and have used the signals and differentiation pathways to build a control mechanism for an artificial immune system. We present our algorithmic details in addition to some preliminary results, where the algorithm was applied for the purpose of anomaly detection. We hope that this algorithm will eventually become the key component within a large, distributed immune system, based on sound immunological concepts.