260 results for algorithmic
Abstract:
Dendritic cells are antigen-presenting cells that provide a vital link between the innate and adaptive immune systems. Research into this family of cells has revealed that they coordinate T-cell-based immune responses, both reactive responses and the generation of tolerance. We have derived an algorithm based on the functionality of these cells, and have used the signals and differentiation pathways to build a control mechanism for an artificial immune system. We present our algorithmic details in addition to some preliminary results, where the algorithm was applied for the purpose of anomaly detection. We hope that this algorithm will eventually become the key component within a large, distributed immune system, based on sound immunological concepts.
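To make the control mechanism concrete, below is a minimal sketch of the signal-fusion step, loosely based on published descriptions of the Dendritic Cell Algorithm; the weight values, migration threshold and class names are illustrative assumptions, not the authors' tuned parameters.

```python
# Minimal sketch: each cell accumulates weighted input signals (PAMP,
# danger, safe) into three outputs and samples antigens; on migration,
# its context labels those antigens. Weights/threshold are illustrative.
from dataclasses import dataclass, field

# rows = output signals, columns = (PAMP, danger, safe) weights
WEIGHTS = {
    "csm":    (2.0, 1.0, 2.0),   # costimulation: drives migration
    "semi":   (0.0, 0.0, 3.0),   # semi-mature: tolerance context
    "mature": (2.0, 1.0, -3.0),  # mature: anomalous context
}

@dataclass
class DendriticCell:
    migration_threshold: float = 10.0
    csm: float = 0.0
    semi: float = 0.0
    mature: float = 0.0
    antigens: list = field(default_factory=list)

    def expose(self, pamp, danger, safe, antigen_id):
        """Accumulate weighted input signals and sample one antigen."""
        for name, (wp, wd, ws) in WEIGHTS.items():
            setattr(self, name, getattr(self, name)
                    + wp * pamp + wd * danger + ws * safe)
        self.antigens.append(antigen_id)

    def migrated(self):
        """The cell migrates once enough costimulation has accumulated."""
        return self.csm >= self.migration_threshold

    def context(self):
        """A 'mature' context flags the sampled antigens as anomalous."""
        return "mature" if self.mature > self.semi else "semi-mature"
```

Antigens collected mostly by cells migrating in the "mature" context would then be scored as anomalous, which is the anomaly-detection use the abstract refers to.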
Abstract:
This paper presents an exploratory study of classroom practices related to the magnitudes of length, time and mass carried out in primary education in Portugal. The study's objectives were to determine which mathematical objects and processes are involved in these practices and which roles the teacher and the pupils perform while carrying them out. The results showed a predominance of procedural and algorithmic knowledge and the use of extra-mathematical, everyday-life situations. The teacher acts as the systematic manager of the pupils' work as well as of the time, space and materials available in the classroom.
Abstract:
A poster version of this paper will be presented at the 25th International Conference on Parallel Architecture and Compilation Technology (PACT ’16), September 11-15, 2016, Haifa, Israel.
Abstract:
Most research in the field of Operations Research uses methods and algorithms to optimize the pick-up and delivery problem, typically by solving the vehicle routing problem to find optimal delivery orders, vehicle assignments and so on. This paper focuses on a green logistics approach, in which a city's existing public transport infrastructure is used for the delivery of small and medium-sized packaged goods, thus helping to reduce urban congestion and greenhouse gas emissions. A study was carried out to investigate the feasibility of the proposed multi-agent based simulation model in terms of cost, time and energy efficiency. A multimodal Dijkstra shortest-path algorithm and Nested Monte Carlo Search are employed in a two-phase algorithmic approach for generating a time-based cost matrix. The quality of a tour depends on the efficiency of the search algorithm implemented for plan generation and route planning. The results reveal a definite advantage of using public transportation over existing delivery approaches in terms of energy efficiency.
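To make the first phase concrete, here is a minimal sketch, assuming the multimodal network is modelled as a graph whose edges carry per-leg travel times tagged with a mode; the toy network, stop names and units are illustrative, not the paper's model.

```python
# Minimal sketch of phase one: Dijkstra over a multimodal graph fills a
# time-based cost matrix between stops. Edges: (neighbour, minutes, mode).
import heapq

def dijkstra(graph, source):
    """Single-source shortest travel times over a multimodal graph."""
    times = {source: 0.0}
    queue = [(0.0, source)]
    while queue:
        t, node = heapq.heappop(queue)
        if t > times.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, dt, _mode in graph.get(node, ()):
            cand = t + dt
            if cand < times.get(nxt, float("inf")):
                times[nxt] = cand
                heapq.heappush(queue, (cand, nxt))
    return times

def time_cost_matrix(graph, stops):
    """All-pairs time-based cost matrix over the chosen stops."""
    return {s: dijkstra(graph, s) for s in stops}

# Toy network: bus and metro legs between three stops.
network = {
    "A": [("B", 12.0, "bus"), ("C", 25.0, "bus")],
    "B": [("C", 7.0, "metro")],
    "C": [],
}
print(time_cost_matrix(network, ["A", "B", "C"]))
```

The resulting matrix would then feed the second phase, where the search (Nested Monte Carlo in the paper) assembles delivery tours from these pairwise travel times.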
Abstract:
Clarifying the organ of origin, and above all the benign or malignant nature, of an adnexal tumour that is indeterminate on ultrasound arises frequently in clinical practice and requires complementary investigation so as to avoid unnecessary surgery while still diagnosing malignant disease early. In this paper we present a Magnetic Resonance Imaging (MRI) diagnostic algorithm, recently proposed by the European Society of Urogenital Radiology (ESUR), which allows an accurate diagnosis of cases that are indeterminate on ultrasound and a responsible clinical approach to these patients.
Abstract:
Completed under a cotutelle (joint supervision) agreement with the École normale supérieure de Cachan – Université Paris-Saclay
Abstract:
The current socioeconomic reality, marked by the technological (r)evolution of recent years and by the demographic and urban explosion, entails two major problems. On the one hand, climate change resulting from the overexploitation of non-renewable resources and energy; on the other, the loss of specific cultural identities and processes brought about by globalisation. In response, several authors propose taking advantage of these very technologies and the new networked society to give an answer appropriate to the present moment. Computational tools allow greater design complexity, achieving an optimisation of resources and processes and minimising their environmental impact. Against mass production and the loss of identity, the computational formulation of global problems makes it possible to move from the mass production of the last century to mass customisation by giving specific answers for each context. It is also necessary for these computational processes to connect with, and involve in the design, the different social actors concerned. For this reason, this research is based on Christopher Alexander's spatial patterns and other algorithmic models of computer-aided design, since these describe parametric solutions to recurring architectural design conflicts. Their approach allows each base solution to generate specific responses, while being corrected and optimised by all its users through digital sharing. The aim is for architectural design to respond to objective criteria based on experience and on a participatory, democratic critique grounded in the patterns, so that designs do not arise from an imposed, closed, top-down approach, but instead give growing importance to the active participation of the social actors involved in their definition and use. Finally, this research seeks to show how patterns can play a decisive role in the abstract conceptualisation of design, while other algorithmic methods reach more concrete stages of the project. In this way, the digital patterns proposed here focus on the customisation of design, whereas the use other authors make of them pursues its optimisation. To this end, the research will analyse the Serpentine Gallery summer pavilions as case studies in which to assess the impact of patterns on current architectural design and their possible adaptation to parametric design.
Abstract:
Bilinear pairings can be used to construct cryptographic systems with very desirable properties. A pairing maps members of groups on elliptic and genus 2 hyperelliptic curves to an extension of the finite field over which the curves are defined. The finite fields must, however, be large to ensure adequate security. The complicated group structure of the curves and the expensive field operations result in time-consuming computations that are an impediment to the practicality of pairing-based systems. The Tate pairing can be computed efficiently using the ηT method. Hardware architectures can be used to accelerate the required operations by exploiting the parallelism inherent in the algorithmic and finite field calculations. The Tate pairing can be performed on elliptic curves of characteristic 2 and 3 and on genus 2 hyperelliptic curves of characteristic 2. Curve selection is dependent on several factors including desired computational speed, the area constraints of the target device and the required security level. In this thesis, custom hardware processors for the acceleration of the Tate pairing are presented and implemented on an FPGA. The underlying hardware architectures are designed with care to exploit available parallelism while ensuring resource efficiency. The characteristic 2 elliptic curve processor contains novel units that return a pairing result in a very low number of clock cycles. Despite the more complicated computational algorithm, the speed of the genus 2 processor is comparable. Pairing computation on each of these curves can be appealing in applications with various attributes. A flexible processor that can perform pairing computation on elliptic curves of characteristic 2 and 3 has also been designed. An integrated hardware/software design and verification environment has been developed. This system automates the procedures required for robust processor creation and enables the rapid provision of solutions for a wide range of cryptographic applications.
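The expense referred to above lies mostly in the underlying finite-field arithmetic. As a point of reference, here is a minimal software sketch of characteristic-2 field multiplication in polynomial basis; the toy field size and reduction polynomial are illustrative assumptions, unrelated to the thesis's parameters or its parallel hardware datapaths, which operate on much larger fields.

```python
# Minimal sketch of characteristic-2 field arithmetic. Elements of GF(2^m)
# are integers whose bits are polynomial coefficients. Field size and
# reduction polynomial below are illustrative, not the thesis's parameters.

M = 7              # toy field GF(2^7)
POLY = 0b10001001  # x^7 + x^3 + 1, irreducible over GF(2)

def gf2m_add(a, b):
    """Addition is coefficient-wise XOR (characteristic 2)."""
    return a ^ b

def gf2m_mul(a, b):
    """Shift-and-add (carry-less) multiplication with modular reduction."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a >> M:       # degree reached m: reduce by POLY
            a ^= POLY
    return result

# Quick check: multiplication distributes over addition in GF(2^7).
a, b, c = 0b1011001, 0b0110110, 0b1110001
assert gf2m_mul(a, gf2m_add(b, c)) == gf2m_add(gf2m_mul(a, b), gf2m_mul(a, c))
```

A hardware processor of the kind described in the thesis would replace this bit-serial loop with wide, parallel multiplier units, which is where the clock-cycle savings come from.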
Abstract:
In this paper, we consider Preference Inference based on a generalised form of Pareto order. Preference Inference aims at reasoning over an incomplete specification of user preferences. We focus on two problems. The Preference Deduction Problem (PDP) asks if another preference statement can be deduced (with certainty) from a set of given preference statements. The Preference Consistency Problem (PCP) asks if a set of given preference statements is consistent, i.e., the statements do not contradict each other. Here, preference statements are direct comparisons between alternatives (strict and non-strict). It is assumed that a set of evaluation functions is known by which all alternatives can be rated. We consider Pareto models, which induce order relations on the set of alternatives in a Pareto manner, i.e., one alternative is preferred to another only if it is preferred on every component of the model. We describe characterisations for deduction and consistency based on an analysis of the set of evaluation functions, and present algorithmic solutions and complexity results for PDP and PCP, based on Pareto models in general and for a special case. Furthermore, a comparison shows that inference based on Pareto models is less cautious than some other well-known types of preference models.
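As a minimal sketch of the Pareto order described above, under the stated assumption that every alternative can be rated by a known set of evaluation functions; the alternatives, scores and functions below are illustrative.

```python
# Minimal sketch: one alternative is preferred to another only if it is
# at least as good on every evaluation function (lower scores are better),
# with one strict win required for strict preference.
scores = {
    "x": (3, 1),
    "y": (5, 2),
    "z": (3, 1),
}
evaluations = [lambda a: scores[a][0], lambda a: scores[a][1]]

def pareto_prefers(a, b, evaluations, strict=True):
    """Componentwise (Pareto) comparison of alternatives a and b."""
    at_least = all(f(a) <= f(b) for f in evaluations)
    if not strict:
        return at_least
    return at_least and any(f(a) < f(b) for f in evaluations)

print(pareto_prefers("x", "y", evaluations))                 # True: better on both
print(pareto_prefers("x", "z", evaluations))                 # False: tied everywhere
print(pareto_prefers("x", "z", evaluations, strict=False))   # True: non-strict
```

In PDP/PCP terms, the question is then whether an order induced componentwise in this way can (for consistency) or must (for deduction) satisfy a given set of strict and non-strict comparison statements.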
Abstract:
Sanctum is a public art work by James Coupe and Juan Pampin. It uses the persistent flow of people around the Henry Art Gallery as input, extracting narratives from the demographics of passers-by and the patterns of their movement. The flow of people is used as a physical analogue to another type of crowd, the virtual inhabitants of social networks such as Facebook.
Abstract:
A history of specialties in economics since the late 1950s is constructed on the basis of a large corpus of documents from economics journals. The production of this history relies on a combination of algorithmic methods that avoid subjective assessments of the boundaries of specialties: bibliographic coupling, automated community detection in dynamic networks, and text mining. These methods uncover a structuring of economics around recognizable specialties, with some significant changes over the period covered (1956-2014). Among our results, especially noteworthy are (a) the clear-cut existence of 10 families of specialties, (b) the disappearance in the late 1970s of a specialty focused on general economic theory, (c) the dispersal of the econometrics-centered specialty in the early 1990s and the ensuing importance of specific econometric methods for the identity of many specialties since the 1990s, and (d) the low level of specialization of individual economists throughout the period, in contrast to physicists as early as the late 1960s.
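As an illustration of the first of these algorithmic methods, the sketch below computes bibliographic coupling strengths on a toy corpus; the papers, references and Jaccard normalisation are illustrative choices, not the study's exact pipeline.

```python
# Minimal sketch of bibliographic coupling: two articles are coupled in
# proportion to the references they share. Toy corpus is illustrative.
from itertools import combinations

corpus = {
    "paper_A": {"ref1", "ref2", "ref3"},
    "paper_B": {"ref2", "ref3", "ref4"},
    "paper_C": {"ref5"},
}

def coupling_strength(refs_a, refs_b):
    """Jaccard-normalised count of shared references."""
    return len(refs_a & refs_b) / len(refs_a | refs_b)

# Edge weights of the coupling network later fed to community detection.
edges = {
    (a, b): coupling_strength(corpus[a], corpus[b])
    for a, b in combinations(sorted(corpus), 2)
    if corpus[a] & corpus[b]
}
print(edges)  # {('paper_A', 'paper_B'): 0.5}
```

Community detection on the resulting weighted network, tracked over time windows, is what yields the specialty families the abstract reports.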
Abstract:
A numerical semigroup is a submonoid of (N, +) whose complement in N is finite. In this work we study some invariants of a numerical semigroup S such as its multiplicity, embedding dimension, Frobenius number, gaps and Apéry set. We characterize a minimal presentation of a numerical semigroup S and describe an algorithmic procedure which allows us to compute a minimal presentation of S. We define an irreducible numerical semigroup as a numerical semigroup that cannot be expressed as the intersection of two numerical semigroups properly containing it. Concluding this work, we study and characterize irreducible numerical semigroups, and describe methods for computing decompositions of a numerical semigroup into irreducible numerical semigroups.
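These invariants lend themselves to direct computation. Below is a minimal sketch, assuming the semigroup is given by a generating set (the generators are illustrative): the Apéry set is obtained as shortest paths over residue classes, in the style of Nijenhuis' method, and the Frobenius number and gap count follow from it.

```python
# Minimal sketch: the Apéry set of S = <generators> with respect to m is
# the least element of S in each residue class mod m, found as shortest
# paths where adding a generator g moves from class r to (r + g) % m at
# cost g. Generators below are illustrative.
import heapq
from functools import reduce
from math import gcd

def apery_set(generators, m):
    """Least element of S = <generators> in each residue class mod m."""
    assert reduce(gcd, generators) == 1, "numerical semigroup needs gcd 1"
    dist = {0: 0}
    queue = [(0, 0)]
    while queue:
        d, r = heapq.heappop(queue)
        if d > dist.get(r, float("inf")):
            continue  # stale entry
        for g in generators:
            nr, nd = (r + g) % m, d + g
            if nd < dist.get(nr, float("inf")):
                dist[nr] = nd
                heapq.heappush(queue, (nd, nr))
    return dist

gens = [5, 7, 9]                  # illustrative semigroup S = <5, 7, 9>
m = min(gens)                     # the multiplicity of S
ap = apery_set(gens, m)
frobenius = max(ap.values()) - m  # largest integer not in S
n_gaps = sum(w // m for w in ap.values())  # number of gaps, |N \ S|
print(ap, frobenius, n_gaps)      # {0: 0, 2: 7, 4: 9, 1: 16, 3: 18} 13 8
```

The derivations of the Frobenius number and the gap count from the Apéry set are the classical Selmer formulas.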
Abstract:
We investigated how participants associated with each other and developed community in a Massive Open Online Course (MOOC) about Rhizomatic Learning (Rhizo14). We compared learner experiences in two social networking sites (SNSs), Facebook and Twitter. Our combination of thematic analysis of qualitative survey data with analysis of participant observation, activity data, archives and visualisation of SNS data enabled us to reach a deeper understanding of participant perspectives and explore SNS use. Community was present in the course title and understood differently by participants. In the absence of explanation or discussion about community early in the MOOC, a controversy between participants about course expectations emerged that created oppositional discourse. Fall-off in activity in MOOCs is common and was evident in Rhizo14. As the course progressed, fewer participants were active on Facebook and some participants reported feelings of exclusion. Despite this, activity on Facebook increased overall, and the top 10 most active participants were responsible for 47% of total activity. In the Rhizo14 MOOC, both community and curriculum were expected to emerge within the course. We suggest that there are tensions and even contradictions between ‘Community Is the Curriculum’ and Deleuze and Guattari's principles of the rhizome, centring mainly on an absence of heterogeneity. These tensions may be exacerbated by SNSs that use algorithmic streams. We propose the use of networking approaches that enable negotiation and exchange to encourage heterogeneity rather than an emergent definition of community.
Abstract:
The consequences of algorithmic management for workers are well known among scholars, but little research investigates the possibilities for agency, especially at the individual level, in the gig economy. Starting from the everyday reality of the work, the aim is to analyse the forms of agency exercised by platform workers in the last-mile logistics sector. The research is based on a multi-sited ethnography conducted in two distant countries and covering two different urban platform services: food delivery in Italy (Bologna, Turin) and ride-hailing in Argentina (Buenos Aires). Despite the differences, the fieldwork revealed several continuities across the geographical contexts. First, digital technologies play an ambivalent role in the work environment: while technology is used by companies to discipline labour, it is also a tool that can be turned to the workers' advantage. In both ride-hailing and food-delivery platforms, workers express their agency by sharing tinkering practices and tactics to circumvent algorithmic despotism. Second, the research brought to light a wide variety of economic activities developed at the margins of the platform economy. In both cases the platforms intersect vividly with urban informal economies and feed informal labour circuits, as shown by the high incidence of illicit exchanges: for example, account selling, hacking bots and digital gangmastering. Far from initiating a process of formalisation, then, the platform subsumes and reproduces the set of productive and reproductive conditions of informality (viração), offering intermittent and insecure jobs to a mass of disposable workers available for underemployment. In conclusion, platforms are defined as baroque infrastructures, where the baroque denotes both the hybrid nature of action, mixing forms of neoliberalism-from-below with practices of peer solidarity, and the progressive restructuring of accumulation processes around a renewed interdependence between the formal and the informal in the infrastructures of the «home-delivered world».
Abstract:
One of the most visionary goals of Artificial Intelligence is to create a system able to mimic and eventually surpass the intelligence observed in biological systems, including, ambitiously, that observed in humans. The main distinctive strength of humans is their ability to build a deep understanding of the world by learning continuously and drawing on their experiences. This ability, found in varying degrees in all intelligent biological beings, allows them to adapt and react properly to change by incrementally expanding and refining their knowledge. Arguably, achieving this ability is one of the main goals of Artificial Intelligence and a cornerstone towards the creation of intelligent artificial agents. Modern Deep Learning approaches have allowed researchers and industry to make great advances towards the resolution of many long-standing problems in areas like Computer Vision and Natural Language Processing. However, while the current age of renewed interest in AI has produced extremely useful applications, a concerningly limited effort is being directed towards the design of systems able to learn continuously. The biggest obstacle that hinders an AI system from learning incrementally is the catastrophic forgetting phenomenon. This phenomenon, discovered in the 90s, naturally occurs in Deep Learning architectures when classic learning paradigms are applied to learning incrementally from a stream of experiences. This dissertation revolves around the Continual Learning field, a sub-field of Machine Learning research that has recently made a comeback following the renewed interest in Deep Learning approaches. This work takes a comprehensive view of continual learning by considering the algorithmic, benchmarking, and applicative aspects of the field. The dissertation also touches on community aspects such as the design and creation of research tools aimed at supporting Continual Learning research, and on the theoretical and practical aspects of public competitions in this field.
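To make the forgetting problem and one classic family of remedies concrete, here is a minimal sketch of rehearsal with a reservoir-sampled replay buffer, a well-known mitigation studied in the Continual Learning literature; the buffer, stream layout and the commented-out model update are illustrative placeholders, not the dissertation's method.

```python
# Minimal sketch: a bounded replay buffer keeps a uniformly representative
# sample of past examples (reservoir sampling) and mixes it into each new
# batch, so updates are not driven by the newest distribution alone.
import random

class ReservoirReplayBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.seen = 0

    def add(self, example):
        """Keep each example seen so far with equal probability."""
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = example

    def sample(self, k):
        return random.sample(self.items, min(k, len(self.items)))

def train_on_stream(experiences, buffer, batch_size=8):
    """Interleave replayed past examples with each incoming batch."""
    for experience in experiences:        # one 'experience' = one task/chunk
        for batch in experience:
            combined = batch + buffer.sample(batch_size)
            # model.update(combined)      # placeholder for the real step
            for example in batch:
                buffer.add(example)
```

Without the replayed examples, sequential training on each new experience would overwrite what earlier experiences taught the model, which is precisely the catastrophic forgetting described above.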