908 results for Task-to-core mapping
Abstract:
Dissertation submitted to obtain the degree of Master in Informatics Engineering
Abstract:
This thesis focuses on providing the means necessary to extract the knowledge contained in the institution's academic history, turning that information into something simple and easy to read for any user. As society progresses, schools receive thousands of students every year who must be guided and monitored by the heads of academic institutions so as to guarantee efficient programmes suited to the educational progress of every student. Assigning a lecturer the responsibility of acting on the academic history of his or her students is not feasible, since a single student can produce thousands of records for analysis. The educational data mining paradigm arises from the need to optimise the available resources by exposing conclusions that are not visible without a deep and careful analysis. This paradigm presents, clearly and succinctly, statistical data analysed by computer, offering the possibility of closing gaps in the teaching quality of institutions. This dissertation details the development of a business intelligence tool capable of, through data mining, analysing and presenting relevant conclusions in a form readable to the user.
Abstract:
This dissertation presents an approach aimed at obstacle detection through three-dimensional perception on all-terrain robots. Given the huge amount of acquired information, the adversities such environments present to an autonomous system, and the swiftness consequently required of each of its navigation decisions, it becomes imperative that the 3-D perception system be able to map obstacles and passageways in the swiftest and most detailed manner. In this document, a hybrid approach is presented that brings the best of several methods together, combining the lightness of less meticulous analyses with the detail brought by more thorough ones. Realizing the former is a terrain-slope mapping system built upon a low-resolution volumetric representation of the surrounding occupancy. For the latter's detailed evaluation, two novel metrics were conceived to discriminate the small depth discrepancies found between a range scanner's beam distance measurements. The hybrid solution resulting from the conjunction of these two representations provides a reliable answer to traversability mapping and a robust discrimination of penetrable vegetation from that constituting real obstructions. Two distinct robotic platforms offered the possibility to test the hybrid approach on very different applications: a boat, under a European project, the ECHORD Riverwatch, and a terrestrial four-wheeled robot for a national project, the Introsys Robot.
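The coarse layer described above, slope mapping over a low-resolution representation, can be illustrated with a minimal sketch: classify grid cells as traversable when the local slope stays under a threshold. The grid values, cell size and threshold below are hypothetical, not the authors' implementation.

```python
import numpy as np

def slope_traversability(elevation, cell_size, max_slope_deg=20.0):
    """Label each cell of a 2-D elevation grid as traversable when the
    local slope (estimated by finite differences) stays below a threshold."""
    dz_dy, dz_dx = np.gradient(elevation, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    return slope_deg <= max_slope_deg

# Flat ground with a 1 m step starting at the fourth column.
grid = np.zeros((5, 5))
grid[:, 3:] = 1.0
mask = slope_traversability(grid, cell_size=0.5)  # cells near the step fail
```

A real system would build the elevation layer from the occupancy volume and refine the flagged cells with the finer-grained metrics mentioned in the abstract.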
Abstract:
The study of the effect of radiation on living tissues is a rather complex task, mainly because tissues are made of a set of complex functional biological structures and interfaces, particularly if one is looking for where damage first takes place and what the underlying reaction mechanisms are. In this work a new approach to studying the effect of radiation is presented, making use of well-identified molecular hetero-structure samples which mimic the biological environment. These were obtained by assembling deoxyribonucleic acid (DNA) and phospholipids onto a solid support, together with a soft water-containing polyelectrolyte precursor, in layered structures, and by producing lipid layers at the liquid/air interface with DNA as subphase. The effects of both ultraviolet (UV) radiation and carbon ion beams were systematically investigated in these heterostructures, namely damage to DNA, by means of vacuum ultraviolet (VUV), infrared (IR), X-ray photoelectron (XPS) and impedance spectroscopy. Experimental results revealed that UV affects the furanose, PO2-, thymine, cytosine and adenine groups. The XPS spectrometry carried out on the samples allowed the VUV and IR results to be validated and led to the conclusion that ionized phosphate groups, surrounded by sodium counterions, congregate hydration water molecules which play a UV-protective role. The ac electrical conductivity measurements revealed that electrical conduction in DNA arises from electron hopping along the DNA chain between base pairs and phosphate groups, with a hopping distance equal to the distance between DNA base pairs, and is strongly dependent on UV radiation exposure owing to the loss of phosphate groups. Characterization of DNA samples exposed to a 4 keV C3+ ion beam also revealed carbon-oxygen bond breaking, phosphate group damage and the formation of new species.
Results from radiation-induced damage studies carried out on biomimetic heterostructures of different compositions revealed that damage depends on sample composition, with respect both to the functional groups targeted and to the extent of damage. Conversely, LbL films of 1,2-dipalmitoyl-sn-Glycero-3-[Phospho-rac-(1-glycerol)] (Sodium Salt) (DPPG) liposomes, alternated with poly(allylamine hydrochloride) (PAH), proved to be unaffected, even by prolonged UV irradiation, in the absence of water molecules. However, DPPG molecules were damaged by UV radiation in the presence of water, with cleavage of C-O, C=O and –PO2- bonds. Finally, the study of DNA interaction with ionic lipids at liquid/air interfaces revealed that the electrical charge of the lipid influences the interaction of the phospholipid with DNA. In the presence of DNA in the subphase, the effects of UV irradiation were seen to be smaller, which suggests that ionic products of biomolecule degradation stabilize the intact DPPG molecules. This mechanism may explain why UV irradiation does not cause immediate cell collapse, thus providing time for the cellular machinery to repair elements damaged by UV.
Abstract:
It is a difficult task to avoid the "smart systems" topic when discussing smart prevention and, similarly, it is a difficult task to address smart systems without focusing on their ability to learn. Following the same line of thought, in the current reality, it seems a Herculean task (or an irreparable omission) to approach the topic of certified occupational health and safety management systems (OHSMS) without discussing integrated management systems (IMSs). The available data suggest that an OHSMS seldom operates as the single management system (MS) in a company, so any statement concerning OHSMS should mainly be interpreted from an integrated perspective. A major distinction between generic systems can be drawn between those that learn, i.e., those systems that have "memory", and those that do not. The former are often depicted as adaptive, since they take past events into account to deal with novel, similar and future events, modifying their structure to enable success in their environment. Often, these systems present nonlinear behavior and huge uncertainty in the forecasting of some events. This paper seeks to portray, for the first time as far as we were able to determine, IMSs as complex adaptive systems (CASs) by listing their properties and dissecting the features that enable them to evolve and self-organize in order to, holistically, fulfil the requirements of different stakeholders and thus thrive by assuring the successful sustainability of a company. Based on the literature review carried out, this is the first time that IMSs are characterized as CASs, which may develop fruitful synergies both for the MS and for the CAS communities. By performing a thorough literature review, and based on some concepts embedded in the "DNA" of the subsystems' implementation standards, the specific aim is to identify, determine and discuss the properties of a generic IMS that should be considered in order to classify it as a CAS.
Abstract:
Dissertation for the integrated master's degree in Psychology
Abstract:
The modern computer systems in use nowadays are mostly processor-dominant, which means that their memory is treated as a slave element with one major task: to serve the data requirements of the execution units. This organization is based on the classical Von Neumann computer model, proposed in the 1950s, seven decades ago. This model suffers from a substantial processor-memory bottleneck, because of the huge disparity between processor and memory working speeds. In order to address this problem, in this paper we propose a novel architecture and organization of processors and computers that attempts to provide a stronger match between the processing and memory elements in the system. The proposed model utilizes a memory-centric architecture, wherein execution hardware is added to the memory code blocks, allowing them to perform instruction scheduling and execution, manage data requests and responses, and communicate directly with the data memory blocks without using registers. This organization allows concurrent execution of all threads, processes or program segments that fit in memory at a given time. We therefore describe several possibilities for organizing the proposed memory-centric system with multiple data and merged logic-memory blocks, utilizing a high-speed interconnection switching network.
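As a rough illustration of the organization sketched above, the toy model below (all names hypothetical, not from the paper) attaches a tiny execution unit to a code-memory block, which reads and writes data-memory blocks over a shared interconnect without any register file:

```python
# Toy model of a memory-centric organization: a code block executes its own
# program and touches data blocks directly through an interconnect.

class DataBlock:
    def __init__(self, values):
        self.values = dict(values)

class CodeBlock:
    """A code-memory block with its own execution unit."""
    def __init__(self, program):
        # Each instruction: (op, data_block_id, dst, src1, src2)
        self.program = program

    def run(self, interconnect):
        for op, blk, dst, a, b in self.program:
            data = interconnect[blk].values  # direct memory access, no registers
            if op == "add":
                data[dst] = data[a] + data[b]
            elif op == "mul":
                data[dst] = data[a] * data[b]

interconnect = {"d0": DataBlock({"x": 2, "y": 3, "z": 0, "w": 0})}
cb = CodeBlock([("add", "d0", "z", "x", "y"),   # z = x + y
                ("mul", "d0", "w", "z", "x")])  # w = z * x
cb.run(interconnect)
```

Several such code blocks could run concurrently, which is the property the paper's switching-network organization is meant to exploit.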
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive abilities instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is here learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that may typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns, which is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs-137 activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
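The core idea of mixing kernels at two spatial scales can be sketched with standard SVR and a summed RBF kernel. Note the mixture weight is hand-set here, whereas the paper learns it automatically, and the data and parameter values below are synthetic assumptions, not the authors' setup:

```python
import numpy as np
from sklearn.svm import SVR

def rbf(X, Y, gamma):
    """Gaussian RBF Gram matrix between two sets of points."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def two_scale_kernel(gamma_short, gamma_long, w=0.5):
    """Fixed convex mixture of a short-scale and a large-scale RBF kernel."""
    return lambda X, Y: w * rbf(X, Y, gamma_short) + (1 - w) * rbf(X, Y, gamma_long)

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = np.sin(X[:, 0]) + 0.2 * np.sin(8 * X[:, 0])  # large-scale trend + short-scale anomaly

model = SVR(kernel=two_scale_kernel(gamma_short=10.0, gamma_long=0.1))
model.fit(X, y)
pred = model.predict(X)
```

scikit-learn accepts a callable kernel that returns the Gram matrix, which makes this kind of multi-scale experiment a one-line change from plain RBF-SVR.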
Abstract:
Using event-related potentials (ERPs), we investigated the neural response associated with preparing to switch from one task to another. We used a cued task-switching paradigm in which the interval between the cue and the imperative stimulus was varied. The difference between response time (RT) on trials on which the task switched and trials on which the task repeated (switch cost) decreased as the interval between cue and target (CTI) was increased, demonstrating that subjects used the CTI to prepare for the forthcoming task. However, RTs on repeated-task trials in blocks during which the task could switch (mixed-task blocks) were never as short as RTs during single-task blocks (mixing cost). This replicates previous research. The ERPs in response to the cue were compared across three conditions: single-task trials, switch trials, and repeat trials. ERP topographic differences were found between single-task trials and mixed-task (switch and repeat) trials at approximately 160 and approximately 310 msec after the cue, indicative of changes in the underlying neural generator configuration as a basis for the mixing cost. In contrast, there were no topographic differences evident between switch and repeat trials during the CTI. Rather, the response of statistically indistinguishable generator configurations was stronger at approximately 310 msec on switch than on repeat trials. By separating differences in ERP topography from differences in response strength, these results suggest that a reappraisal of previous research is appropriate.
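The two behavioural measures above have simple definitions, which the sketch below makes concrete; the response times are invented for illustration, not the study's data:

```python
import statistics

def switch_and_mixing_costs(rt_single, rt_repeat_mixed, rt_switch):
    """Switch cost: switch vs. repeat trials within mixed-task blocks.
    Mixing cost: repeat trials in mixed blocks vs. single-task blocks."""
    switch_cost = statistics.mean(rt_switch) - statistics.mean(rt_repeat_mixed)
    mixing_cost = statistics.mean(rt_repeat_mixed) - statistics.mean(rt_single)
    return switch_cost, mixing_cost

# Illustrative RTs in msec.
sc, mc = switch_and_mixing_costs(
    rt_single=[450, 460, 455],
    rt_repeat_mixed=[520, 530, 525],
    rt_switch=[600, 610, 605],
)
# sc is the switch cost (here 80 msec), mc the mixing cost (here 70 msec)
```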
Abstract:
Random single-pass sequencing of cDNA fragments, also known as generation of Expressed Sequence Tags (ESTs), has been highly successful in the study of the gene content of higher organisms, and forms an integral part of most genome projects, with the objective of identifying new genes and targets for disease control and prevention, and of generating mapping probes. In the Trypanosoma cruzi genome project, EST sequencing has also been a starting point, and here we report data on the first 797 sequences obtained, partly from a CL Brener epimastigote non-normalized library and partly from a normalized library. Only around 30% of the sequences obtained showed similarity with the Genbank and dbEST databases, half of which with sequences already reported for T. cruzi.
Abstract:
The project is based on the creation of a tool to manage the relationships that the Centro Metalúrgico de Sabadell has with its clients. The task is to develop a CRM (Customer Relationship Management) system to manage all the services and offers the centre provides to its clients: courses taught or organised, queries from the members' employees, consultancy, and client actions, as well as useful information for tracking the state of commercial actions with a client, the contacts of a member, and its employees.
Abstract:
Introduction. In autism and schizophrenia, attenuated/atypical functional hemispheric asymmetry and theory of mind impairments have been reported, suggesting common underlying neuroscientific correlates. We here investigated whether impaired theory of mind performance is associated with attenuated/atypical hemispheric asymmetry. An association may explain the co-occurrence of both dysfunctions in psychiatric populations. Methods. Healthy participants (n = 129) performed a left-hemisphere-dominant task (lateralised lexical decision) and a right-hemisphere-dominant task (lateralised face decision), as well as a visual cartoon task to assess theory of mind performance. Results. Linear regression analyses revealed inconsistent associations between theory of mind performance and functional hemispheric asymmetry: enhanced theory of mind performance was only associated with (1) faster right-hemisphere language processing, and (2) reduced right-hemisphere dominance for face processing (men only). Conclusions. The majority of non-significant findings suggest that theory of mind and functional hemispheric asymmetry are unrelated. Instead of "overinterpreting" the two significant results, discrepancies in the previous literature relating to problems with the theory of mind concept, the variety of tasks, and the lack of normative data are discussed. We also suggest how future studies could explore a possible link between hemispheric asymmetry and theory of mind.
Abstract:
This first annual report of the Director of Public Health highlights the many public health challenges that affect people in Northern Ireland and how the public health team tackles this complex agenda by working with many statutory, community and voluntary partner organisations across health, local government, education, housing and other sectors. The report refers to core tables throughout; these tables provide key statistical data on population, birth and death rates, mortality by cause, life expectancy, immunisation and screening. The report and the core tables are available below.
Abstract:
This report provides our advice to the Minister for Education and Science on the application for designation as a university made by Waterford Institute of Technology (WIT). WIT submitted an application for designation in February 2006. There is a statutory procedure for the creation of a new university under Section 9 of the Universities Act 1997. We were asked to advise the Minister on the merits of the submission in order for her to provide guidance to Government on whether such a formal statutory review should be initiated. It is not a straightforward task to advise on this case, for several reasons. These include the facts that the regulatory environment for Institutes of Technology has changed significantly since WIT made its application, and that the designation of any IoT would potentially challenge the government's current higher education policy. So our report has to range more widely than the merits of the WIT application taken at face value.
Abstract:
We want to shed some light on the development of personal mobility by analysing the repeated cross-sectional data of the four National Travel Surveys (NTS) conducted in Germany since the mid-1970s. The above-mentioned driving forces operate on different levels of the system that generates the spatial behaviour we observe: travel demand is derived from the needs and desires of individuals to participate in spatially separated activities. Individuals organise their lives in an interactive process within the context they live in, using the given infrastructure. Essential determinants of their demand are the individual's socio-demographic characteristics, but the opportunities and constraints defined by the household and the environment are also relevant for the behaviour which can ultimately be realised. In order to fully capture the context which determines individual behaviour, the (nested) hierarchy of persons within households within spatial settings has to be considered. The data we use for our analysis contain information on these three levels. With the analysis of these micro-data we attempt to improve our understanding of the macro developments summarised above. In addition, we investigate the predictive power of a few classic socio-demographic variables for the daily travel distance of individuals in the four NTS data sets, with a focus on the evolution of this predictive power. The additional task of correctly measuring distances travelled by means of the NTS is hampered by the fact that, although these surveys measure the same variables, different sampling designs and data collection procedures were used. So the aim of the analysis is also to detect variables whose control corrects for the known measurement error, as a prerequisite to applying appropriate models in order to better understand the development of individual travel behaviour in a multilevel context. This task is further complicated by the fact that variables that inform on survey procedures and outcomes are only provided with the data set for 2002 (see Infas and DIW Berlin, 2003).