964 results for brain-computer interface
Abstract:
Brain dopamine transporter imaging by Single Photon Emission Computed Tomography (SPECT) with 123I-FP-CIT (DaTScan™) has become an important tool in the diagnosis and evaluation of parkinsonian syndromes. This diagnostic method allows the visualization of a portion of the striatum, where the healthy pattern resembles two symmetric commas, and thus the evaluation of the presynaptic dopaminergic system, in which dopamine transporters are involved in the release of dopamine into the synaptic cleft and its reabsorption into the nigrostriatal nerve terminals, where it is stored or degraded. In daily practice, the assessment of DaTScan™ studies commonly relies on visual evaluation alone. However, this process is complex and subjective, as it depends on the observer's experience and is associated with high intra- and inter-observer variability. Studies have shown that semiquantification can improve the diagnosis of parkinsonian syndromes. Semiquantification requires image segmentation methods based on regions of interest (ROIs): ROIs are drawn over specific (striatum) and non-specific (background) uptake areas, and specific binding ratios are then calculated. The low adoption of semiquantification in the diagnosis of parkinsonian syndromes is related not only to the time it requires, but also to the need for a database of reference values adapted to the population concerned and to each department's examination protocol. Studies have concluded that this process increases the reproducibility of semiquantification. The aim of this investigation was to create and validate a database of healthy controls for dopamine transporter imaging with DaTScan™, named DBRV. The created database is adapted to the protocol of the Nuclear Medicine Department and to the population of Infanta Cristina Hospital in Badajoz, Spain.
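The specific binding ratio mentioned above is conventionally computed as (striatal counts - background counts) / background counts. The following Python sketch illustrates that step on a synthetic slice; the image, ROI masks and count levels are placeholders, not data from this study.

```python
# Minimal sketch of DaTScan semiquantification: mean counts inside a striatal ROI and
# a background ROI are combined into a specific binding ratio (SBR).
import numpy as np

def specific_binding_ratio(image: np.ndarray,
                           striatum_mask: np.ndarray,
                           background_mask: np.ndarray) -> float:
    striatum = image[striatum_mask].mean()      # specific (striatal) uptake
    background = image[background_mask].mean()  # non-specific (background) uptake
    return (striatum - background) / background

# Toy example: synthetic 2D slice with extra counts inside a "striatal" region.
rng = np.random.default_rng(0)
img = rng.poisson(20, size=(128, 128)).astype(float)
striatum_mask = np.zeros_like(img, dtype=bool); striatum_mask[50:70, 40:60] = True
background_mask = np.zeros_like(img, dtype=bool); background_mask[10:30, 90:120] = True
img[striatum_mask] += 60                         # simulate specific binding
print(f"SBR = {specific_binding_ratio(img, striatum_mask, background_mask):.2f}")
```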
Abstract:
Brain dopamine transporter imaging by Single Photon Emission Computed Tomography (SPECT) with 123I-FP-CIT has become an important tool in the diagnosis and evaluation of parkinsonian syndromes, since this radiopharmaceutical exhibits high affinity for the membrane transporters responsible for the cellular reabsorption of dopamine in the striatum. Although Ordered Subset Expectation Maximization (OSEM) is the reconstruction method recommended in the literature, Filtered Back Projection (FBP) is still used due to its fast processing, even though it presents some disadvantages. The aim of this work is to investigate the influence of FBP reconstruction parameters on the semiquantification of brain studies with 123I-FP-CIT, compared with the results obtained with the recommended OSEM reconstruction.
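As an illustration of how FBP reconstruction parameters affect the result, the Python sketch below reconstructs a test phantom with several FBP filters using scikit-image. It is not the clinical FBP/OSEM pipeline used in this work, and the filter_name argument assumes a recent scikit-image release.

```python
# Sketch: vary the FBP filter on a Shepp-Logan phantom and compare reconstruction error.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phantom = shepp_logan_phantom()
theta = np.linspace(0.0, 180.0, 120, endpoint=False)
sinogram = radon(phantom, theta=theta)

for filt in ("ramp", "shepp-logan", "hann"):     # FBP filter choices
    recon = iradon(sinogram, theta=theta, filter_name=filt)
    err = np.sqrt(np.mean((recon - phantom) ** 2))
    print(f"{filt:12s} RMSE = {err:.4f}")
```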
Abstract:
This dissertation falls within the scope of Information Systems, specifically the development of Web applications, such as a website. With the large-scale use of technological resources, their growth has been exponential, which translates into the ease with which various types of computing platforms can be found on the Internet. Moreover, nowadays a large proportion of organisations have their own website, where they publicise their services and/or products. This dissertation aims to explore these technologies, namely UML (Unified Modeling Language) diagrams and database design, and subsequently to develop a website. The development of this website does not propose the creation of a new technology, but rather the combined use of several technologies supported by UML tools. The work is organised in three main phases: requirements analysis, implementation and interface design. In the requirements analysis, the objectives proposed for the system and the needs/requirements for its implementation were identified, supported essentially by the system's Use Case Diagram. In the implementation phase, the files and directories that form the logical architecture were built according to the models described in the Class Diagram and in the Entity-Relationship Diagram. The identified requirements were analysed and used in the composition of the interfaces and of the navigation system. Finally, in the interface design phase, the developed interfaces were refined based on the author's artistic and creative concept. This refinement reflects personal taste and aims to produce an interface that can also please as many users as possible. It can be seen in the way the links between pages are distributed, in the titles, headers, colours, animations and overall design. Different programming languages were used to develop the website, namely HyperText Markup Language (HTML), PHP: Hypertext Preprocessor (PHP) and JavaScript. HTML was used to lay out all the visible content of the pages and to define their layout, while PHP runs small scripts that interact with the different features of the site. JavaScript was used to define the design of the pages and to add some visual effects. The pages that make up the website were built with Macromedia Dreamweaver, which simplified their implementation given the ease with which pages can be constructed with it. To interact with the database management system, MySQL, the phpMyAdmin application was used, which simplifies access to the database, allowing its data to be defined, manipulated and queried.
Abstract:
Several social and economic factors favour the application of home automation technologies in buildings. In the particular case of residential buildings, users tend to install systems for controlling security, the environment, irrigation mechanisms and alarms. Thus, following the marketing premise that identifies as good practice the design of products/services that satisfy the needs identified by their users, this work is based on the creation of a home automation system, controlled remotely through an Android application, whose first goal is the control of the lamps of a dwelling. This work uses the KNX.TP protocol for communication among the home automation devices available at ISEP, which constitute the test environment of this work. In order to implement the remote control of these devices over the Internet, this work focuses on the development of an IP-KNX interface, using as control hardware an Arduino Mega 2560, an Ethernet interface shield for Arduino, the KNX integration board, and a web server running PHP. For demonstration purposes, an application was created for the Android OS that controls the lamps of the KNX network. Several programming languages were used in this work: C++ in the Arduino firmware, PHP on the web server, and Java + XML in the Android application.
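As a purely illustrative sketch of the control chain described above (mobile client, PHP web server, Arduino/Ethernet board, KNX bus), the snippet below sends an HTTP command to a hypothetical PHP endpoint. The URL, parameter names, group address and acknowledgement format are assumptions, not part of the actual implementation.

```python
# Hypothetical client-side call: ask the PHP gateway to switch a lamp on the KNX bus.
import requests

GATEWAY = "http://example-home-server/knx/switch.php"   # hypothetical PHP endpoint

def set_lamp(group_address: str, on: bool) -> bool:
    resp = requests.get(GATEWAY,
                        params={"ga": group_address, "cmd": "on" if on else "off"},
                        timeout=5)
    resp.raise_for_status()
    return resp.text.strip() == "OK"    # assumed acknowledgement format

if __name__ == "__main__":
    set_lamp("1/0/1", True)             # e.g. living-room lamp on group address 1/0/1
```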
Abstract:
Thesis to obtain the Master of Science Degree in Computer Science and Engineering
Abstract:
Project work presented to the Escola Superior de Comunicação Social as part of the requirements for obtaining the Master's degree in Audiovisual and Multimedia.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Biomedical Engineering.
Abstract:
Environmental management is a complex task. The amount and heterogeneity of the data needed for an environmental decision making tool are overwhelming without adequate database systems and innovative methodologies. As far as data management, data interaction and data processing are concerned, we propose here the use of a Geographical Information System (GIS), whilst for decision making we suggest a Multi-Agent System (MAS) architecture. With the adoption of a GIS we hope to provide complementary coexistence between heterogeneous data sets, a correct data structure, good storage capacity and a user-friendly interface. By choosing a distributed architecture such as a Multi-Agent System, where each agent is a semi-autonomous Expert System with the necessary skills to cooperate with the others in order to solve a given task, we hope to ensure dynamic problem decomposition and to achieve better performance than standard monolithic architectures. Finally, in view of the partial, imprecise and ever-changing character of the information available for decision making, Belief Revision capabilities are added to the system. Our aim is to present and discuss an intelligent environmental management system capable of suggesting the most appropriate land-use actions based on the existing spatial and non-spatial constraints.
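A minimal sketch of the kind of task decomposition such a Multi-Agent System performs is shown below. The agent names, skills and belief handling are purely illustrative and do not reproduce the authors' Expert System or Belief Revision machinery.

```python
# Illustrative coordinator/agent split: each semi-autonomous agent owns one skill and
# answers only for it; a coordinator merges the answers for a land-use query.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    skill: str                         # e.g. "hydrology", "land-use", "legislation"
    beliefs: dict = field(default_factory=dict)

    def assess(self, parcel):
        # a real expert system would consult GIS layers and rules here
        return {self.skill: self.beliefs.get(parcel, "unknown")}

    def revise(self, parcel, value):
        # belief revision entry point: newer, better-founded information replaces old
        self.beliefs[parcel] = value

def coordinate(agents, parcel):
    """Decompose the question "is this land use acceptable?" across agents and merge."""
    report = {}
    for agent in agents:
        report.update(agent.assess(parcel))
    return report

agents = [Agent("A1", "hydrology"), Agent("A2", "land-use"), Agent("A3", "legislation")]
agents[0].revise("parcel-17", "flood-prone")
print(coordinate(agents, "parcel-17"))
```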
Abstract:
This article discusses the development of an Intelligent Distributed Environmental Decision Support System, built upon the association of a Multi-agent Belief Revision System with a Geographical Information System (GIS). The inherently multidisciplinary nature of the expertise involved in environmental management, the need to define clear policies that allow the synthesis of divergent perspectives and their systematic application, and the reduction of the costs and time that result from this integration are the main reasons motivating this project. This paper is organised in two parts: in the first part we present and discuss the developed Distributed Belief Revision Test-bed (DiBeRT); in the second part we analyse its application to the environmental decision support domain, with special emphasis on the interface with a GIS.
Abstract:
Submitted in partial fulfillment of the requirements for the degree of Master in Computer Science
Abstract:
This paper proposes and reports the development of an open source solution for the integrated management of Infrastructure as a Service (IaaS) cloud computing resources, through the use of a common API taxonomy, to incorporate open source and proprietary platforms. This research included two surveys on open source IaaS platforms (OpenNebula, OpenStack and CloudStack) and a proprietary platform (Parallels Automation for Cloud Infrastructure - PACI) as well as on IaaS abstraction solutions (jClouds, Libcloud and Deltacloud), followed by a thorough comparison to determine the best approach. The adopted implementation reuses the Apache Deltacloud open source abstraction framework, which relies on the development of software driver modules to interface with different IaaS platforms, and involved the development of a new Deltacloud driver for PACI. The resulting interoperable solution successfully incorporates OpenNebula, OpenStack (reuses pre-existing drivers) and PACI (includes the developed Deltacloud PACI driver) nodes and provides a Web dashboard and a Representational State Transfer (REST) interface library. The results of the exchanged data payload and time response tests performed are presented and discussed. The conclusions show that open source abstraction tools like Deltacloud allow the modular and integrated management of IaaS platforms (open source and proprietary), introduce relevant time and negligible data overheads and, as a result, can be adopted by Small and Medium-sized Enterprise (SME) cloud providers to circumvent the vendor lock-in problem whenever service response time is not critical.
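For context, a Deltacloud server exposes a uniform REST interface regardless of the back-end driver. The Python sketch below shows the style of call a client could make against such a server; the host, port, credentials and form fields are assumptions and may differ per deployment and driver.

```python
# Sketch of a client talking to a Deltacloud REST endpoint (assumed at localhost:3001).
import requests

BASE = "http://localhost:3001/api"
AUTH = ("cloud-user", "cloud-password")        # placeholder back-end credentials
HEADERS = {"Accept": "application/json"}

def list_instances():
    resp = requests.get(f"{BASE}/instances", auth=AUTH, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def create_instance(image_id):
    # POST to /api/instances with the chosen image; extra form fields
    # (hardware profile, realm) are driver-dependent
    resp = requests.post(f"{BASE}/instances", auth=AUTH, headers=HEADERS,
                         data={"image_id": image_id})
    resp.raise_for_status()
    return resp.json()

print(list_instances())
```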
Abstract:
We propose a fractional model for computer virus propagation. The model includes the interaction between computers and removable devices. We simulate numerically the model for distinct values of the order of the fractional derivative and for two sets of initial conditions adopted in the literature. We conclude that fractional order systems reveal richer dynamics than the classical integer order counterpart. Therefore, fractional dynamics leads to time responses with super-fast transients and super-slow evolutions towards the steady-state, effects not easily captured by the integer order models.
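A common way to simulate commensurate fractional-order systems numerically is the Grunwald-Letnikov discretization, in which a memory term makes the dynamics depend on the whole past trajectory. The Python sketch below implements that scheme for an illustrative susceptible/infected model of computers and removable devices; the equations and rates are placeholders, not the model studied in the paper.

```python
# Explicit Grunwald-Letnikov scheme for a commensurate fractional system D^alpha x = f(x).
import numpy as np

def gl_weights(alpha, n):
    """Coefficients c_j = (-1)^j * binom(alpha, j), via c_0 = 1, c_j = (1-(alpha+1)/j) c_{j-1}."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = (1.0 - (1.0 + alpha) / j) * c[j - 1]
    return c

def solve_fractional(f, x0, alpha, h, steps):
    """x_n = h^alpha * f(x_{n-1}) - sum_{j=1..n} c_j * x_{n-j} (full-memory GL scheme)."""
    x = np.zeros((steps + 1, np.size(x0)))
    x[0] = x0
    c = gl_weights(alpha, steps)
    ha = h ** alpha
    for n in range(1, steps + 1):
        memory = np.tensordot(c[1:n + 1], x[n - 1::-1][:n], axes=(0, 0))
        x[n] = ha * f(x[n - 1]) - memory
    return x

# Illustrative (not the paper's) dynamics for healthy/infected computers (S, I)
# and infected removable devices (R); all rates are hypothetical placeholders.
def virus_model(x, beta=0.4, gamma=0.1, theta=0.05, delta=0.02):
    S, I, R = x
    return np.array([
        -beta * S * I - theta * S * R,                 # computers getting infected
         beta * S * I + theta * S * R - gamma * I,     # infected computers, cleaned at rate gamma
         delta * I - delta * R,                        # devices picking up / clearing the virus
    ])

for alpha in (1.0, 0.9, 0.8):      # integer order vs. two fractional orders
    traj = solve_fractional(virus_model, [0.95, 0.05, 0.0], alpha, h=0.05, steps=2000)
    print(alpha, traj[-1])
```

For alpha = 1 the weights reduce to forward Euler, so the integer-order case is recovered as a special case, while alpha < 1 introduces the long-memory behaviour responsible for the fast transients and slow approach to steady state mentioned above.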
Abstract:
Composition is a practice of key importance in software engineering. When real-time applications are composed, it is necessary that their timing properties (such as meeting deadlines) are guaranteed. The composition is performed by establishing an interface between the application and the physical platform. Such an interface typically contains information about the amount of computing capacity needed by the application. For multiprocessor platforms, the interface should also convey information about the degree of parallelism. Several interface proposals have recently been put forward in various research works. However, those interfaces are either too complex to handle or too pessimistic. In this paper we propose the generalized multiprocessor periodic resource model (GMPR), which is strictly superior to the MPR model without requiring an overly detailed description. We then derive a method to compute the interface from the application specification. This method has been implemented in Matlab routines that are publicly available.
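The sketch below shows one plausible shape for such a GMPR-style interface as a small data structure: a period and a vector of cumulative budgets indexed by parallelism level, with a few sanity checks. It is an assumption-laden illustration, not the paper's formal definition or its Matlab routines.

```python
# Illustrative GMPR-style interface: period PI and cumulative budgets THETA_1..THETA_m,
# where THETA_k is read as the supply guaranteed per period at parallelism at most k.
# The checks below are plausible consistency conditions, not the paper's constraints.
from dataclasses import dataclass
from typing import List

@dataclass
class GMPRInterface:
    period: float              # PI, the resource period
    budgets: List[float]       # THETA_1 <= ... <= THETA_m (cumulative, per period)

    def __post_init__(self):
        assert self.period > 0 and len(self.budgets) >= 1
        prev = 0.0
        for k, theta in enumerate(self.budgets, start=1):
            assert prev <= theta <= k * self.period, "budgets must be nondecreasing and feasible"
            assert theta - prev <= self.period, "per-level increment cannot exceed the period"
            prev = theta

    @property
    def bandwidth(self) -> float:
        """Long-run computing capacity (in processors) delivered to the application."""
        return self.budgets[-1] / self.period

iface = GMPRInterface(period=10.0, budgets=[6.0, 11.0, 14.0])   # hypothetical 3-processor interface
print(f"total bandwidth = {iface.bandwidth:.2f} processors")
```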
Abstract:
Thesis presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the subject of Electrical and Computer Engineering
Abstract:
Master's degree in Electrical and Computer Engineering - Area of Specialisation in Automation and Systems