503 results for algorithmic skeletons
Abstract:
This paper presents an exploratory study of classroom practices related to the magnitudes of length, time and mass, carried out in Primary Education in Portugal. The study set out to determine which mathematical objects and processes are involved in these practices and which roles the teacher and pupils perform while carrying them out. The results show a predominance of procedural and algorithmic knowledge and the use of extra-mathematical or everyday-life situations. The teacher acts as the systematic manager of the pupils' work, as well as of the time, spaces and materials available in the classroom.
Abstract:
A poster of this paper will be presented at the 25th International Conference on Parallel Architectures and Compilation Techniques (PACT ’16), September 11-15, 2016, Haifa, Israel.
Abstract:
Most research in the field of Operations Research uses methods and algorithms to optimize the pick-up and delivery problem. Most studies aim to solve the vehicle routing problem, accommodating optimal delivery orders, vehicles, etc. This paper focuses on a green logistics approach, in which a city's existing Public Transport infrastructure is used for the delivery of small and medium-sized packaged goods, thus helping to reduce urban congestion and greenhouse gas emissions. We carried out a study to investigate the feasibility of the proposed multi-agent-based simulation model in terms of cost, time and energy consumption. A multimodal Dijkstra shortest-path algorithm and Nested Monte Carlo Search are employed in a two-phase algorithmic approach used to generate a time-based cost matrix. The quality of the tour depends on the efficiency of the search algorithm implemented for plan generation and route planning. The results reveal a definite advantage of using Public Transportation over existing delivery approaches in terms of energy efficiency.
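As an illustration of the first phase, the sketch below runs a time-weighted Dijkstra search over a small multimodal network; the stops, modes and travel times are invented for the example and are not taken from the paper.

```python
# Minimal sketch of a time-weighted Dijkstra search over a multimodal
# network. The graph, stop names and travel times are illustrative
# assumptions, not the paper's data.
import heapq

def dijkstra(graph, source):
    """Return shortest travel times (minutes) from source to all nodes."""
    dist = {source: 0}
    queue = [(0, source)]
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbour, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                heapq.heappush(queue, (nd, neighbour))
    return dist

# Edges mix walking and public-transport legs; weights are travel times.
graph = {
    "depot":       [("bus_stop_A", 5)],             # walk
    "bus_stop_A":  [("bus_stop_B", 12)],            # bus leg
    "bus_stop_B":  [("tram_stop_C", 4), ("customer", 9)],
    "tram_stop_C": [("customer", 6)],               # tram leg
}
times = dijkstra(graph, "depot")
print(times["customer"])  # 26 minutes via depot -> A -> B -> customer
```

Pairwise shortest times computed this way would populate the time-based cost matrix that the second, Nested Monte Carlo Search phase consumes.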
Abstract:
Clarifying the organ of origin, and above all the benign or malignant nature, of an adnexal tumour that is indeterminate on ultrasound arises frequently in clinical practice and requires complementary investigation of its nature in order to avoid unnecessary surgery while still diagnosing malignant disease early. In this work we present a Magnetic Resonance Imaging (MRI) diagnostic algorithm, recently proposed by the European Society of Urogenital Radiology (ESUR), which allows an accurate diagnosis of cases that are indeterminate on ultrasound and a responsible clinical approach to these patients.
Abstract:
Completed under a cotutelle (joint supervision) agreement with the École normale supérieure de Cachan – Université Paris-Saclay.
Abstract:
The current socioeconomic reality, marked by the technological (r)evolution of recent years and by the demographic and urban explosion, entails two major problems. On the one hand, climate change resulting from the overexploitation of non-renewable resources and energy sources; on the other, the loss of specific cultural identities and processes brought about by globalisation. Faced with these, several authors propose taking advantage of technology itself and of the new networked society to give a response in keeping with the present moment. Computational tools allow greater design complexity, achieving an optimisation of resources and processes and minimising environmental impact. Against mass production and the loss of identity, the computational formulation of global problems makes it possible to move from the mass production of the last century to mass customisation, by giving specific responses for each context. It is also necessary for these computational processes to connect and involve the different social actors concerned in the design. This research is therefore based on Christopher Alexander's spatial patterns and other algorithmic models of computer-aided design, since these describe parametric solutions to recurrent architectural design conflicts. This approach allows each base solution to generate specific responses, while being corrected and optimised by all its users through digital sharing. The aim is for architectural design to respond to objective criteria based on experience and on participatory, democratic, pattern-based critique, so that designs do not arise from an imposed, closed, top-down approach, but instead give greater weight to the active participation of the social actors involved in their definition and use. Finally, this research seeks to show how patterns can play a decisive role in the abstract conceptualisation of design, while other algorithmic methods address more concrete phases of the project. Thus, the digital patterns proposed here focus on the customisation of design, whereas their use by other authors pursues its optimisation. To this end, the research analyses the summer pavilions of the Serpentine Gallery as case studies in which to assess the impact of patterns on current architectural design and their possible adaptation to parametric design.
Abstract:
Ambipolar organic field-effect transistors (OFETs), which can efficiently transport both holes and electrons using a single type of electrode, are currently of great interest due to their possible applications in complementary metal oxide semiconductor (CMOS)-like circuits, sensors, and light-emitting transistors. Several theoretical and experimental studies have argued that most organic semiconductors should be able to transport both types of carrier, although typically unipolar behavior is observed. One factor that can compromise ambipolar transport in organic semiconductors is poor solid-state overlap between the HOMO (p-type) or LUMO (n-type) orbitals of neighboring molecules in the semiconductor thin film. In the search for low-bandgap ambipolar materials, where the absence of skeletal distortions allows closer intermolecular π-π stacking and enhanced intramolecular π-conjugation, a new family of oligothiophene-naphthalimide assemblies has been synthesized and characterized, in which both donor and acceptor moieties are directly conjugated through rigid linkers. In previous work we found that oligothiophene-naphthalimide assemblies connected through amidine linkers (NDI derivatives) exhibit skeletal distortions (50-60°) arising from steric hindrance between the carbonyl group of the arylene core and the sulphur atom of the neighboring thiophene ring (see Figure 1). In the present work we report novel oligo- and polythiophene-naphthalimide analogues NAI-3T, NAI-5T and poly-NAI-8C-3T, in which the connections of the amidine linkage have been inverted in order to prevent steric interactions. Thus, the nitrogen atoms are directly connected to the naphthalene moiety in the NAI derivatives, whereas they were attached directly to the thiophene moiety in the previously investigated NDI-3T and NDI-5T. Figure 1 depicts the calculated molecular structure of NAI-3T together with that of NDI-3T, showing that the steric interactions are absent in the novel NAI derivative. The planar skeletons in this new family induce a higher degree of crystallinity, and charge transport can be switched from n-type to ambipolar behaviour. The highest FET performance is achieved for vapor-deposited films of NAI-3T, with mobilities of 1.95×10⁻⁴ cm²V⁻¹s⁻¹ and 2.00×10⁻⁴ cm²V⁻¹s⁻¹ for electrons and holes, respectively. Finally, these planar semiconductors are compared with their NDI-derivative analogues, which exhibit only n-type mobility, in order to understand the origin of the ambipolarity in this new series of molecular semiconductors.
Abstract:
Bilinear pairings can be used to construct cryptographic systems with very desirable properties. A pairing maps members of groups on elliptic and genus 2 hyperelliptic curves to an extension of the finite field over which the curves are defined. The finite fields must, however, be large to ensure adequate security. The complicated group structure of the curves and the expensive field operations result in time-consuming computations that are an impediment to the practicality of pairing-based systems. The Tate pairing can be computed efficiently using the ηT method. Hardware architectures can be used to accelerate the required operations by exploiting the parallelism inherent in the algorithmic and finite-field calculations. The Tate pairing can be performed on elliptic curves of characteristic 2 and 3 and on genus 2 hyperelliptic curves of characteristic 2. Curve selection depends on several factors, including the desired computational speed, the area constraints of the target device and the required security level. In this thesis, custom hardware processors for the acceleration of the Tate pairing are presented and implemented on an FPGA. The underlying hardware architectures are designed with care to exploit available parallelism while ensuring resource efficiency. The characteristic 2 elliptic curve processor contains novel units that return a pairing result in a very low number of clock cycles. Despite the more complicated computational algorithm, the speed of the genus 2 processor is comparable. Pairing computation on each of these curves can be appealing in applications with various attributes. A flexible processor that can perform pairing computation on elliptic curves of characteristic 2 and 3 has also been designed. An integrated hardware/software design and verification environment has been developed. This system automates the procedures required for robust processor creation and enables the rapid provision of solutions for a wide range of cryptographic applications.
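As a flavour of the finite-field arithmetic such processors parallelise, the sketch below multiplies two elements of a small characteristic-2 field in a polynomial basis; the field GF(2^7) and its reduction polynomial are illustrative choices, not parameters from the thesis.

```python
# Illustrative sketch of multiplication in GF(2^m) with a polynomial
# basis, the kind of field operation pairing hardware accelerates.
# GF(2^7) with reduction polynomial x^7 + x + 1 (irreducible over GF(2))
# is an assumption chosen for brevity.

M = 7                      # extension degree m
IRRED = (1 << 7) | 0b11    # x^7 + x + 1

def gf2m_mul(a, b):
    """Multiply two GF(2^m) elements held as integer bit-vectors."""
    result = 0
    while b:
        if b & 1:          # add (XOR) the current shifted copy of a
            result ^= a
        b >>= 1
        a <<= 1
        if a >> M:         # reduce modulo the irreducible polynomial
            a ^= IRRED
    return result

# (x^3 + x) * (x + 1) = x^4 + x^3 + x^2 + x; no reduction needed here
assert gf2m_mul(0b1010, 0b11) == 0b11110
```

In hardware, these shift-and-XOR steps become wide combinational logic that can process many coefficients per clock cycle, which is the parallelism the thesis sets out to exploit.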
Abstract:
In this paper, we consider Preference Inference based on a generalised form of Pareto order. Preference Inference aims at reasoning over an incomplete specification of user preferences. We focus on two problems. The Preference Deduction Problem (PDP) asks if another preference statement can be deduced (with certainty) from a set of given preference statements. The Preference Consistency Problem (PCP) asks if a set of given preference statements is consistent, i.e., the statements do not contradict each other. Here, preference statements are direct comparisons between alternatives (strict and non-strict). It is assumed that a set of evaluation functions is known by which all alternatives can be rated. We consider Pareto models, which induce order relations on the set of alternatives in a Pareto manner, i.e., one alternative is preferred to another only if it is preferred on every component of the model. We describe characterisations of deduction and consistency based on an analysis of the set of evaluation functions, and present algorithmic solutions and complexity results for PDP and PCP, based on Pareto models in general and on a special case. Furthermore, a comparison shows that inference based on Pareto models is less cautious than that based on some other well-known types of preference models.
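The sketch below illustrates only the Pareto order such models induce, not the paper's deduction algorithm: one alternative is preferred to another only if every evaluation function rates it at least as well. The evaluation functions, the lower-is-better convention and the alternatives are assumptions made for the example.

```python
# Minimal sketch of a Pareto order over alternatives rated by a set of
# evaluation functions. Lower scores are assumed better here; for a
# strict statement we additionally require a strictly better score on
# some component.

def pareto_prefers(alpha, beta, evaluations, strict=False):
    """True iff alpha is preferred to beta on every evaluation function."""
    weakly = all(f(alpha) <= f(beta) for f in evaluations)
    if not strict:
        return weakly
    return weakly and any(f(alpha) < f(beta) for f in evaluations)

# Alternatives as (price, delivery_days); two evaluation functions.
evaluations = [lambda a: a[0], lambda a: a[1]]
print(pareto_prefers((10, 2), (12, 3), evaluations, strict=True))  # True
print(pareto_prefers((10, 4), (12, 3), evaluations))               # False
```

The second call returns False because the alternatives are incomparable (each wins on one component), which is exactly the cautiousness of Pareto orders that the paper's comparison discusses.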
Abstract:
Sanctum is a public art work by James Coupe and Juan Pampin. It uses the persistent flow of people around the Henry Art Gallery as input, extracting narratives from the demographics of passers-by and the patterns of their movement. The flow of people is used as a physical analogue to another type of crowd, the virtual inhabitants of social networks such as Facebook.
Abstract:
A history of specialties in economics since the late 1950s is constructed on the basis of a large corpus of documents from economics journals. The production of this history relies on a combination of algorithmic methods that avoid subjective assessments of the boundaries of specialties: bibliographic coupling, automated community detection in dynamic networks and text mining. These methods uncover a structuring of economics around recognizable specialties, with some significant changes over the time period covered (1956-2014). Among our results, especially noteworthy are (a) the clear-cut existence of 10 families of specialties, (b) the disappearance in the late 1970s of a specialty focused on general economic theory, (c) the dispersal of the econometrics-centered specialty in the early 1990s and the ensuing importance of specific econometric methods for the identity of many specialties since the 1990s, and (d) the low level of specialization of individual economists throughout the period, in contrast to that of physicists as early as the late 1960s.
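For concreteness, the sketch below computes bibliographic coupling, the first of the three methods named, in its simplest form: the coupling strength of two articles is the number of references they share. The article and reference identifiers are invented for the example.

```python
# Minimal sketch of bibliographic coupling: two articles are coupled in
# proportion to the cited sources their reference lists share.

def coupling_strength(refs_a, refs_b):
    """Number of cited sources shared by two reference lists."""
    return len(set(refs_a) & set(refs_b))

articles = {
    "paper_1": ["arrow1951", "samuelson1954", "nash1950"],
    "paper_2": ["samuelson1954", "nash1950", "debreu1959"],
    "paper_3": ["keynes1936"],
}
print(coupling_strength(articles["paper_1"], articles["paper_2"]))  # 2
print(coupling_strength(articles["paper_1"], articles["paper_3"]))  # 0
```

Pairwise strengths of this kind define the weighted article network on which the community detection step is then run.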
Abstract:
A numerical semigroup is a submonoid of (N, +) whose complement in N is finite. In this work we study some invariants of a numerical semigroup S, such as its multiplicity, embedding dimension, Frobenius number, gaps and Apéry set. We characterize a minimal presentation of a numerical semigroup S and describe an algorithmic procedure which allows us to compute a minimal presentation of S. We define an irreducible numerical semigroup as a numerical semigroup that cannot be expressed as the intersection of two numerical semigroups properly containing it. Concluding this work, we study and characterize irreducible numerical semigroups, and describe methods for computing decompositions of a numerical semigroup into irreducible numerical semigroups.
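Several of the invariants listed here can be computed by brute force once the semigroup's generators are fixed; the sketch below does so for the illustrative semigroup ⟨5, 7, 9⟩ (an assumption, not an example from the text), using the fact that once min(generators) consecutive integers belong to S, so does every larger integer.

```python
# Brute-force sketch of some numerical-semigroup invariants: the
# multiplicity, the gaps, the Frobenius number and the Apéry set of
# the multiplicity. Generators <5, 7, 9> are an illustrative choice.
from functools import reduce
from math import gcd

def invariants(generators):
    """Multiplicity, gaps, Frobenius number and Apéry set of <generators>."""
    assert reduce(gcd, generators) == 1, "complement is finite iff gcd is 1"
    m = min(generators)                      # multiplicity
    member, run, n = {0}, 0, 0
    while run < m:                           # m consecutive members => done
        n += 1
        if any(n >= g and (n - g) in member for g in generators):
            member.add(n)
            run += 1
        else:
            run = 0
    gaps = sorted(set(range(1, n + 1)) - member)
    frobenius = max(gaps) if gaps else -1    # largest integer not in S
    # Apery set of m: least element of S in each residue class mod m
    apery = {}
    for s in sorted(member):
        apery.setdefault(s % m, s)
    return m, gaps, frobenius, sorted(apery.values())

m, gaps, f, apery = invariants([5, 7, 9])
print(m, f)       # 5 13
print(apery)      # [0, 7, 9, 16, 18]
```

The output illustrates the classical identity that the Frobenius number equals the maximum of the Apéry set minus the multiplicity (18 - 5 = 13).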
Abstract:
This study presents for the first time the diet of a Late Antiquity population in southern Portugal (Civitas of Pax Julia), from the Roman villa of Monte da Cegonha (predominantly 7th century CE). Stable isotope analysis (δ13C, δ15N, δ18O, 87Sr/86Sr) of human and faunal bone collagen and apatite was conducted in order to understand the influence of Roman subsistence strategies on the way of life of rural inhabitants of the area of Pax Julia and to explore their diet (types of ingested plants, amount of animal resources, terrestrial versus marine resources). X-ray diffraction (XRD) and Fourier-transform infrared spectroscopy (FTIR) analyses were used to determine the degree of bone diagenesis and to assess the reliability of the bone stable isotopic composition for palaeodietary reconstruction. Anthropological analysis revealed a cariogenic diet, rich in starchy food and carbohydrates, in at least two individuals, based on the frequency of dental caries. Collagen and apatite carbon isotopic analysis suggested that C3 plants were the basis of the population's diet, complemented with some terrestrial meat and its by-products, as reflected by the observed bone collagen nitrogen isotopic composition. Moreover, whilst the fairly low apatite-collagen spacing recorded in some skeletons (at around 4‰) may have been due to the intake of freshwater organisms, the relatively low nitrogen values observed indicate that this consumption did not occur very often, except perhaps in the form of fresh fish of low trophic level or fish sauces. There were no significant differences in isotopic values depending on gender or burial type. The strontium and oxygen isotopic composition of bone apatite revealed a sedentary community, with the exception of one male individual who probably did not spend his childhood in Monte da Cegonha.
Abstract:
Magnetic Resonance Imaging (MRI) is the in vivo technique most commonly employed to characterize changes in brain structures. The conventional MRI-derived morphological indices are able to capture only partial aspects of brain structural complexity. Fractal geometry and its most popular index, the fractal dimension (FD), can characterize self-similar structures, including grey matter (GM) and white matter (WM). Previous literature shows the need for a definition of the so-called fractal scaling window, within which each structure manifests self-similarity. This justifies the existence of fractal properties and confirms Mandelbrot's assertion that "fractals are not a panacea; they are not everywhere". In this work, we propose a new approach to automatically determine the fractal scaling window, computing two new fractal descriptors, i.e., the minimal and maximal fractal scales (mfs and Mfs). Our method was implemented in a software package, validated on phantoms and applied to large datasets of structural MR images. We demonstrated that the FD is a useful marker of the morphological complexity changes that occur during brain development and aging and, using ultra-high magnetic field (7T) examinations, we showed that the cerebral GM has fractal properties also below the spatial scale of 1 mm. We applied our methodology to two neurological diseases. We observed a reduction of brain structural complexity in SCA2 patients and, using a machine learning approach, proved that the cerebral WM FD is a consistent feature in predicting cognitive decline in patients with small vessel disease and mild cognitive impairment. Finally, we showed that the FD of the WM skeletons derived from diffusion MRI provides complementary information to that obtained from the FD of the general WM structure in T1-weighted images. In conclusion, the fractal descriptors of structural brain complexity are candidate biomarkers to detect subtle morphological changes during development, aging and in neurological diseases.
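As an illustration of the quantity being estimated, the sketch below implements plain box-counting, the standard estimator of the FD: count occupied boxes at several scales and fit log N(s) against log(1/s). The binary image and box sizes are illustrative assumptions; the mfs/Mfs descriptors proposed above would correspond to restricting the fit to the sub-range of scales over which this log-log relation is actually linear.

```python
# Minimal sketch of box-counting estimation of the fractal dimension.
# The FD is the slope of log N(s) versus log(1/s), where N(s) is the
# number of boxes of side s that contain part of the structure.
import numpy as np

def box_counting_fd(binary_image, box_sizes):
    """Estimate FD as the slope of log N(s) vs log(1/s)."""
    counts = []
    for s in box_sizes:
        h, w = binary_image.shape
        occupied = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if binary_image[i:i + s, j:j + s].any():
                    occupied += 1
        counts.append(occupied)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)),
                          np.log(counts), 1)
    return slope

# Sanity check: a filled square should have FD close to 2.
image = np.ones((64, 64), dtype=bool)
print(round(box_counting_fd(image, [2, 4, 8, 16]), 2))  # ~2.0
```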
Abstract:
We investigated how participants associated with each other and developed community in a Massive Open Online Course (MOOC) about Rhizomatic Learning (Rhizo14). We compared learner experiences in two social networking sites (SNSs), Facebook and Twitter. Our combination of thematic analysis of qualitative survey data with analysis of participant observation, activity data, archives and visualisation of SNS data enabled us to reach a deeper understanding of participant perspectives and to explore SNS use. Community was present in the course title and was understood differently by participants. In the absence of explanation or discussion about community early in the MOOC, a controversy between participants about course expectations emerged that created oppositional discourse. Fall-off in activity is common in MOOCs and was evident in Rhizo14. As the course progressed, fewer participants were active in Facebook and some participants reported feelings of exclusion. Despite this, activity in Facebook increased overall: the 10 most active participants were responsible for 47% of total activity. In the Rhizo14 MOOC, both community and curriculum were expected to emerge within the course. We suggest that there are tensions and even contradictions between ‘Community Is the Curriculum’ and Deleuze and Guattari's principles of the rhizome, mainly concerning an absence of heterogeneity. These tensions may be exacerbated by SNSs that use algorithmic streams. We propose the use of networking approaches that enable negotiation and exchange, to encourage heterogeneity rather than an emergent definition of community.