945 results for User-Computer Interface
Abstract:
The teaching/learning activities concerning the daylit built environment require from the Architecture and Urbanism undergraduate student the ability to abstract the effects of daylight distributed in the three-dimensional space being designed. Several tools and techniques can be used to facilitate the understanding of the phenomena involved, among which is computational simulation. This paper reports the digital inclusion of daylighting teaching in the Architecture and Urbanism undergraduate course at the School of Architecture, Arts and Social Communication of Bauru (FAAC) of UNESP – Sao Paulo State University, which began in 2010. The inclusion process relied on free software, specifically the programs DIALux and SketchUp+Radiance, both with graphical output for visualizing the illuminated scenes and analyzing results. The graphic model is converted from SketchUp to Radiance by a plugin, and a user-friendly Windows interface was developed to run the lighting simulations. The digital inclusion process is consolidated, with wide acceptance by students, for whom computational simulation facilitates understanding of the relation between daylight and the built environment and supports the design of daylighting control elements.
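The abstract does not describe the interface's internals; purely as a sketch of the kind of automation such an interface performs, the snippet below drives a Radiance rendering from Python through the standard command-line tools oconv and rpict. File names, option values and scene contents are illustrative assumptions, not the actual workflow.

```python
import subprocess

def render_daylit_scene(rad_files, view_file, out_hdr="render.hdr"):
    """Illustrative pipeline: compile a Radiance scene and render one view.

    rad_files : Radiance scene descriptions (e.g. exported from SketchUp)
    view_file : Radiance view (.vf) file describing the camera
    File names and option values are assumptions, not fixed defaults.
    """
    # oconv compiles the scene description into an octree on stdout
    with open("scene.oct", "wb") as oct_file:
        subprocess.run(["oconv", *rad_files], stdout=oct_file, check=True)

    # rpict renders an HDR image of the view; -ab sets ambient bounces
    with open(out_hdr, "wb") as img:
        subprocess.run(
            ["rpict", "-vf", view_file, "-ab", "2", "-x", "512", "-y", "512", "scene.oct"],
            stdout=img, check=True,
        )
    return out_hdr

if __name__ == "__main__":
    render_daylit_scene(["room.rad", "sky.rad"], "south_window.vf")
```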
Abstract:
The elimination of all external incisions is an important step in reducing the invasiveness of surgical procedures. Natural Orifice Translumenal Endoscopic Surgery (NOTES) is an incision-less approach that offers explicit benefits such as reduced patient trauma and shorter recovery time. However, technological difficulties impede the widespread adoption of the NOTES method. A novel robotic tool has been developed that makes NOTES procedures feasible by using multiple interchangeable tool tips. The robotic tool can enter the body cavity through an orifice or a single incision using a flexible articulated positioning mechanism, and once inserted it is not constrained by incisions, allowing visualization and manipulation throughout the cavity. The interchangeable tool tips initially consist of three end effectors: a grasper, scissors, and an atraumatic Babcock clamp. The tool changer can select and switch between the three tools depending on the surgical task, using a miniature mechanism driven by micro-motors. The robotic tool is remotely controlled through a joystick and computer interface. This thesis details the following aspects of the robotic tool. The first-generation robot is designed as a conceptual model for implementing a novel mechanism for switching, advancing, and controlling the tool tips using two micro-motors. It is believed that this mechanism reduces cumbersome instrument exchanges and can reduce overall procedure time and the risk of inadvertent tissue trauma during exchanges with a natural orifice approach. Moreover, placing actuators directly at the surgical site enables the robot to generate sufficient force to operate effectively. Mounting the multifunctional robot on the distal end of an articulating tube removes restrictions on the robot kinematics and helps solve some of the difficulties otherwise faced during surgery using NOTES or related approaches. The second-generation multifunctional robot is then introduced; its overall size is reduced and two arms provide two additional degrees of freedom, making insertion through the esophagus feasible and increasing dexterity. Improvements are needed in future iterations of the multifunctional robot; nevertheless, the work presented is a proof of concept for NOTES robots capable of abdominal surgical interventions.
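The abstract mentions teleoperation through a joystick and computer interface but not the mapping itself; the sketch below is a hypothetical illustration of one common scheme, scaling joystick axes into velocity setpoints for the micro-motors with a deadband and saturation. The I/O functions read_joystick_axes and send_motor_setpoints are placeholders, not the thesis's actual interface.

```python
def scale_axis(value, deadband=0.05, max_speed=1.0):
    """Map a joystick axis in [-1, 1] to a motor velocity setpoint.

    A small deadband suppresses drift around the neutral position and the
    output is clipped to the motor's allowed range. Values are illustrative.
    """
    if abs(value) < deadband:
        return 0.0
    return max(-max_speed, min(max_speed, value * max_speed))

def control_step(read_joystick_axes, send_motor_setpoints):
    """One iteration of a teleoperation loop (placeholder I/O callables)."""
    axes = read_joystick_axes()          # e.g. (x, y) in [-1, 1]
    setpoints = [scale_axis(a) for a in axes]
    send_motor_setpoints(setpoints)      # forwarded to the micro-motor drivers
```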
Abstract:
The Xylella fastidiosa comparative genomic database is a scientific resource whose aim is to provide a user-friendly interface for accessing high-quality, manually curated genomic annotation and comparative sequence analysis, as well as for identifying and mapping prophage-like elements, a marked feature of Xylella genomes. Here we describe the database and the tools for exploring the biology of this important plant pathogen. The hallmarks of this database are its high-quality genomic annotation, its functional and comparative genomic analysis, and the identification and mapping of prophage-like elements. It is available at http://www.xylella.lncc.br.
Abstract:
In this thesis work, an EEG-based BCI system was developed and tested that exploits the modulation of sensorimotor rhythms through motor imagery of the right and left hand. To improve the separability of the two mental states, the CSP (Common Spatial Pattern) algorithm was used in combination with a linear SVM classifier. The two mental states were employed to control the movement (rotation) of a one-degree-of-freedom upper-limb model simulated on the screen. The core of the thesis work was the development of the BCI system software (based on the LabVIEW 2011 platform), which is described in the thesis. The complete system was then tested on 4 subjects over 6 training sessions.
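The abstract names the processing chain (CSP spatial filtering followed by a linear SVM) but the implementation was done in LabVIEW; purely as an illustration of the same chain, the sketch below uses the Python libraries MNE and scikit-learn on a synthetic two-class epoch array. The shapes and parameter values are assumptions.

```python
import numpy as np
from mne.decoding import CSP                 # Common Spatial Patterns
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for band-pass-filtered motor-imagery epochs:
# 80 trials x 16 EEG channels x 500 samples, two classes (left/right hand).
rng = np.random.default_rng(0)
X = rng.standard_normal((80, 16, 500))
y = np.repeat([0, 1], 40)

# CSP projects each epoch onto a few spatial filters and returns log-variance
# features; a linear SVM then separates the two imagery classes.
clf = Pipeline([
    ("csp", CSP(n_components=4, log=True)),
    ("svm", SVC(kernel="linear")),
])
print(cross_val_score(clf, X, y, cv=5).mean())
```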
Abstract:
Every year there is a growing increase in the number of people affected by neurodegenerative diseases such as amyotrophic lateral sclerosis, multiple sclerosis and Parkinson's disease, and of people with severe motor disabilities due to stroke, cerebral palsy or spinal cord injury. These conditions often entail highly disabling, permanent impairments of the nerve pathways responsible for controlling the muscles involved in the voluntary execution of actions. In recent years, many research groups have turned to the development of systems able to carry out the user's intentions. Such systems are generally called neural interfaces; they are not designed to operate autonomously but to interact with the subject. These technologies, also known as Brain Computer Interfaces (BCI), enable direct communication between the brain and an external device, generally based on electroencephalography (EEG), allowing the central nervous system to communicate with an external peripheral. These tools do not use the usual efferent pathways involved in producing actions, such as nerves and muscles, but connect brain activity to a computer that records and interprets its variations, thereby making it possible to restore the damaged connections in an alternative way and to recover, at least in part, the lost functions. The results of numerous studies show that BCI systems can allow people with severe motor disabilities to share their intentions with the surrounding world, demonstrating the important role such systems can play in certain phases of their lives.
Abstract:
The functioning of the human brain, the organ responsible for our every action and thought, has always been of great interest to scientific research. Once it was understood that groups of neurons develop electrical potentials in response to stimuli, it became possible to plot their time course with the advent of ElectroEncephaloGraphy (EEG). This technology has become part of routine examinations in neuropsychology research and in clinical practice, since it allows the diagnosis and discrimination of the various types of epilepsy, head trauma and other pathologies of the central nervous system. Unfortunately it has several shortcomings: the signal is affected by noise and requires appropriate processing through filtering and amplification, yet it remains sensitive to the inhomogeneity of biological tissues, making it difficult to identify the signal sources activated during the examination (the so-called inverse problem). In recent decades research has led to the development of new investigation techniques; of particular interest are High-Resolution ElectroEncephaloGraphy (HREEG) and MagnetoEncephaloGraphy (MEG). HREEG employs a larger number of electrodes (up to 256) and relies on accurate mathematical models to approximate the distribution of the electric potential on the subject's scalp, providing better spatial resolution and greater confidence in locating the neuronal sources. Progress in the field of superconductors has made the development of MEG possible: it can record the weak magnetic fields produced by cortical electrical signals, providing information immune to tissue inhomogeneity and complementing EEG in scientific research. These new technologies have opened new fields of development, most importantly the possibility of controlling prostheses and devices through mental effort (Brain Computer Interface). The future holds promise for further innovations.
Abstract:
Pragmatism is the leading motivation of regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is therefore quite general and, although the introduction of a penalty is probably the most popular type, it is just one of multiple forms of regularization. In this dissertation, we focus on the applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing learning, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques for modeling the response of biological neurons. Supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing the structure of Gaussian Bayesian networks using L1-regularization as a filter.
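As a minimal illustration of how an L1 penalty induces the sparsity the dissertation pursues (generic scikit-learn usage, not the dissertation's own code), the sketch below fits a Lasso model to data in which only a few inputs are truly relevant and inspects which coefficients survive.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 50                      # more inputs than are actually relevant
X = rng.standard_normal((n, p))
true_coef = np.zeros(p)
true_coef[:3] = [2.0, -1.5, 1.0]    # only 3 of the 50 inputs matter
y = X @ true_coef + 0.1 * rng.standard_normal(n)

# The L1 penalty (strength alpha) drives most coefficients exactly to zero,
# yielding the sparse representation discussed in the abstract.
model = Lasso(alpha=0.1).fit(X, y)
print("non-zero coefficients:", np.flatnonzero(model.coef_))
```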
Abstract:
This report covers the design, development and construction of a small hovercraft equipped with a remote control that lets the user act on its speed and direction. The project may be used in the future as a basis for developing more complex applications. A hovercraft is a means of transport whose chassis is lifted above the ground by a lift motor that inflates a skirt placed underneath it. In addition, one or more motors are placed at the rear of the vehicle to propel it. Because the hovercraft is not in direct contact with the ground, it can move over land as well as over water or ice and can overcome small obstacles. On the other hand, this also becomes a problem, since its friction while moving is very small, which makes it very difficult to brake and prone to spinning on its own due to the inertia of the movement and the forces produced by the air currents under the chassis. For this project, however, no skirt was fitted under the chassis, because its design is rather complicated; the friction with the ground is therefore lower, which increases the problems described above. The project consists of two parts, remote control and hovercraft, connected through radio-frequency (RF) antennas. The design and development of each part was carried out separately, except for the communications between the two. The remote control is divided into three parts. The first comprises the user interface and the circuit that generates the analog signals corresponding to the user's commands. The user interface consists of three potentiometers: one rotary and two sliders. The rotary potentiometer is used to control the hovercraft's turning direction, while each slider controls the power of the lift motor and of the thrust motor, respectively. In all three cases the potentiometers are placed in the circuit so that they act as controllable voltage dividers. The second part is a microcontroller of the PSoC family. This family of microcontrollers is characterized by great adaptability to the target application, thanks to the possibility of choosing the analog and digital peripherals that make up the microcontroller. For the remote control it is configured with three A/D converters that digitize the signals coming from the potentiometers, three programmable amplifiers to use the full scale of the converters, an LCD used to debug the C code with which it is programmed, and an SPI module that interfaces the microcontroller to the antenna. In addition, four external pins are used to select the antenna's transmission channel. The third part is the QFM-TRX1-24G radio-frequency (RF) transceiver module, which in the remote control works as a transmitter. It uses Manchester coding to ensure low error rates. The remote-control circuits are powered by four 1.5-volt AA batteries in series. In the hovercraft, five parts can be distinguished. The first is the communications module, which uses the same transceiver as the remote control but this time working as a receiver; it therefore serves as the data input to the system, delivering the user's instructions. This module communicates with the next one, a microcontroller of the PSoC family, through an SPI interface. In this case the microcontroller is configured with an SPI module, an LCD used to debug the code, and three PWM modules (two 8-bit and one 16-bit) to control the hovercraft's motors and servo. In addition, four external pins are used to select the data-reception channel. The third and fourth parts can be considered together: both consist of the same electronic circuit based on MOSFET transistors. The gate of each transistor receives a 100 kHz PWM signal from the microcontroller, which controls the operating mode of the transistors; the transistors carry a heat sink to keep them from burning out. In turn, the transistors drive two fans, which act as the hovercraft's lift and thrust motors. The fifth and last part is a standard hobby servo. The servo is controlled by a PWM signal in which the length of the positive pulse sets the position of the servo head, turning in one direction or the other according to the instructions sent from the remote control by the user. Two different power supplies were used for the hovercraft: one made up of four 1.5-volt AA batteries in series, which powers the microcontroller and the servo, and four rechargeable 3.2-volt lithium batteries in series, which power the motor circuit. The last part of the project is the final mounting and assembly of the devices. For the hovercraft's chassis, a rectangular cover of expanded polystyrene was used, of the kind commonly found in the packaging of fragile products. This material is quite light and highly impact resistant, making it ideal for the purpose of the project. Two holes were made in the chassis: a circular one in its center, into which the lift motor is inserted and fixed with glue, and a servo-shaped hole on one of the narrow sides of the rectangle, into which the servo is fitted. The thrust motor is attached to the rotating head of the servo so that it rotates together with it, steering the hovercraft. The remaining electronic circuits and the batteries are fixed to the chassis with adhesive tape and glue, taking care at all times to distribute the weight evenly over the chassis to increase the hovercraft's stability. SUMMARY: In this final-year project a remote-controlled hovercraft was designed using mainly technology that is well known to students in the embedded systems programme. This platform could be used to develop further and more complex projects. The system was developed by dividing the work into two parts: remote control and hovercraft. The hardware of the hovercraft and of the remote control was designed separately; however, the software was designed at the same time, since it was needed to develop the communication system. The result of the project is a remote-controlled hovercraft with a user-friendly interface. The system is based on microcontroller technology and uses common remote-control techniques. The system has been designed with technology commonly used by students at Metropolia University so that it can be readily understood and used to develop other projects based on this platform.
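The report's control path (potentiometer, A/D converter, PWM duty cycle for the fan MOSFETs and servo) can be illustrated with a short sketch; this is a generic, hypothetical Python rendering of that mapping, not the PSoC C firmware described in the report.

```python
def adc_to_duty(adc_value, adc_bits=8):
    """Map a raw A/D reading of a slider potentiometer to a PWM duty cycle in [0, 1]."""
    return adc_value / ((1 << adc_bits) - 1)

def adc_to_servo_pulse_us(adc_value, adc_bits=8, min_us=1000, max_us=2000):
    """Map the rotary potentiometer reading to a hobby-servo pulse width.

    Standard servos expect a positive pulse of roughly 1-2 ms inside a 20 ms
    period; the pulse length sets the head position (typical values, not taken
    from the report).
    """
    frac = adc_value / ((1 << adc_bits) - 1)
    return min_us + frac * (max_us - min_us)

# Example: mid-scale readings -> ~50% duty on the fan PWM, ~1.5 ms servo pulse
print(adc_to_duty(128), adc_to_servo_pulse_us(128))
```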
Abstract:
This Master's Final Project develops a LabVIEW application for the automatic acquisition of electroluminescence maps of solar cells in general, and of concentrator multijunction solar cells as a particular case, under different bias conditions. The system allows the acquisition of electroluminescence maps of each of the sub-cells of a multijunction cell. The spatial variations in the measured electroluminescence intensity can be analyzed and correlated with defects of different types in the semiconductor structure or in the metal contacts of the solar cell device. The theoretical part presents the state of the art of electroluminescence-based characterization of solar cells and summarizes the previous work on this topic at the Instituto de Energía Solar (IES). For the practical part, two LabVIEW drivers were designed. The first driver controls a source-meter that injects current into the solar cell and measures the voltage between its terminals. The second driver controls and automates the acquisition, by a CCD sensor, of the electroluminescence image of the solar cell under the chosen bias conditions. Both drivers are included in the final application, which offers the user an interface to apply different bias conditions to the solar cell and to acquire the electroluminescence maps. This system is essential for the solar cell degradation studies currently under way at the IES-UPM; in fact, the first such measurements were performed in this project and their results are presented in the final part of this report.
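The acquisition flow described above (bias the cell with a source-meter, read its voltage, then trigger a CCD capture at each bias point) was implemented as LabVIEW drivers; the sketch below is a rough Python analogue of that loop using PyVISA. The VISA address, the SCPI command strings and the acquire_ccd_image stub are assumptions, not the project's actual drivers.

```python
import pyvisa

def acquire_ccd_image(exposure_s):
    """Placeholder for the CCD driver; would return an electroluminescence frame."""
    raise NotImplementedError

def el_map_sweep(currents_a, visa_addr="GPIB0::24::INSTR", exposure_s=1.0):
    """Acquire one EL image per injected current level (illustrative only)."""
    rm = pyvisa.ResourceManager()
    smu = rm.open_resource(visa_addr)
    smu.write(":SOUR:FUNC CURR")          # source current, measure voltage
    smu.write(":OUTP ON")
    frames = []
    for i_set in currents_a:
        smu.write(f":SOUR:CURR {i_set}")  # forward-bias the cell
        v = float(smu.query(":MEAS:VOLT?"))
        frames.append((i_set, v, acquire_ccd_image(exposure_s)))
    smu.write(":OUTP OFF")
    return frames
```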
Abstract:
Systems based on the OFDM (Orthogonal Frequency Division Multiplexing) technique are an evolution of traditional FDM (Frequency Division Multiplexing) systems that achieves a more efficient use of the bandwidth. Nowadays, OFDM systems and their variants occupy a very important place in communications, being implemented in standards such as DVB-T (digital terrestrial television), ADSL, LTE, WiMAX and DAB (digital radio). For this reason, this project implements an OFDM system in Matlab on which various simulations can be run to better understand its operation. The simulations pursue two key objectives: to test the use of turbo codes (compared with traditional convolutional codes) and of an equalizer, with the aim of improving the quality of the system (fewer erroneous bits received) under increasingly adverse conditions: low signal-to-noise ratios and multipath. The necessary Matlab functions were implemented, together with a graphical user interface (GUI) that makes the program easier to use and more didactic. The second and third chapters cover the fundamentals of OFDM systems: the second presents a purely theoretical study, while the third focuses on the theory behind the blocks implemented in the OFDM system developed in this project. The fourth chapter describes the options available through the implemented interface and serves as a manual for its correct use. The fifth chapter is divided into two parts: the first shows the representations the program can produce, and the second presents simulations that test how the system responds to different channel configurations and to different configurations of the system itself (one coding scheme or the other, use of the equalizer or of the cyclic prefix, etc.). Finally, the last chapter presents the conclusions of this project and possible lines of work for future versions.
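As a compact illustration of the transmit/receive chain the project simulates in Matlab (IFFT, cyclic prefix, multipath channel, FFT and one-tap equalization), here is a hedged numpy sketch; the parameter values and channel taps are arbitrary assumptions, and channel coding is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sc, cp = 64, 16                                   # subcarriers, cyclic prefix length

# QPSK symbols on each subcarrier
bits = rng.integers(0, 2, (2, n_sc))
X = (1 - 2 * bits[0] + 1j * (1 - 2 * bits[1])) / np.sqrt(2)

# OFDM modulation: IFFT + cyclic prefix
x = np.fft.ifft(X)
tx = np.concatenate([x[-cp:], x])

# Multipath channel (shorter than the CP) plus noise
h = np.array([0.8, 0.0, 0.4 + 0.2j])
rx = np.convolve(tx, h)[: len(tx)] + 0.01 * rng.standard_normal(len(tx))

# Receiver: drop CP, FFT, one-tap (zero-forcing) equalizer per subcarrier
Y = np.fft.fft(rx[cp:cp + n_sc])
H = np.fft.fft(h, n_sc)
X_hat = Y / H
print(np.mean(np.abs(X_hat - X) ** 2))              # residual error after equalization
```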
Abstract:
A new version of the TomoRebuild data reduction software package is presented for the reconstruction of scanning transmission ion microscopy tomography (STIMT) and particle induced X-ray emission tomography (PIXET) images. First, we present a state of the art of the reconstruction codes available for ion beam microtomography. The algorithm proposed here brings several advantages. It is a portable, multi-platform code, designed in C++ with well-separated classes for easier use and evolution. Data reduction is separated into different steps, and the intermediate results may be checked if necessary. Although no additional graphics library or numerical tool is required to run the program from the command line, a user-friendly interface was designed in Java as an ImageJ plugin. All experimental and reconstruction parameters may be entered either through this plugin or directly in text-format files. A simple standard format is proposed for the input of experimental data. Optional graphic applications using the ROOT interface may be used separately to display and fit energy spectra. Regarding the reconstruction process, the filtered backprojection (FBP) algorithm, already present in the previous version of the code, was optimized so that it is now about 10 times faster. In addition, the Maximum Likelihood Expectation Maximization (MLEM) algorithm and its accelerated version, Ordered Subsets Expectation Maximization (OSEM), were implemented. A detailed user guide in English is available. A reconstruction example from experimental data of a biological sample is given. It shows the capability of the code to reduce noise in the sinograms and to deal with incomplete data, which opens a new perspective on tomography with a low number of projections or a limited angle range.
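The MLEM algorithm the package adds has a standard multiplicative update; as a generic illustration (not TomoRebuild's C++ implementation), the sketch below applies it to a toy system matrix with numpy. OSEM follows the same update restricted to subsets of the projections at each sub-iteration.

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """Maximum Likelihood Expectation Maximization for y ≈ A @ x with x >= 0.

    A : (n_projections, n_voxels) system matrix
    y : measured projections (sinogram rows stacked into a vector)
    """
    x = np.ones(A.shape[1])                 # uniform non-negative start image
    sens = A.sum(axis=0) + eps              # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / (A @ x + eps)           # compare measurement with forward projection
        x *= (A.T @ ratio) / sens           # multiplicative update keeps x >= 0
    return x

# Toy example: 3-pixel object observed through 4 projection rays
A = np.array([[1., 1., 0.], [0., 1., 1.], [1., 0., 1.], [1., 1., 1.]])
x_true = np.array([2.0, 0.5, 1.0])
print(mlem(A, A @ x_true))
```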
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
This paper explores the complexities and contradictions of frontline practice that pose problems for personalised social care through enhanced choice. It draws on semi-structured interviews with community care workers, social workers, occupational therapists and care managers in a social service department. Practitioners interviewed were asked about their current assessment and documentation system, including the assessment documents currently used; how they approached information gathering and the topics they explored with service users; and their experience of documenting assessment and care management. The paper argues that the validity and sustainability of personalised social care in frontline practice relies on developing a thorough understanding of the complex and implicit assessment processes operating at the service user/practitioner interface and the inevitable tensions that arise for practitioners associated with the organisational context and broader service environment. The findings demonstrate the variability among practitioners in how they collect information and more importantly, the critical role practitioners occupy in determining the kinds of topics to be explored during the assessment process. In so doing, it shows how practitioners can exert control over the decision-making process. More importantly, it provides some insight into how such processes are shaped by the constraints of the organisational context and broader service environment. Complexities and contradictions may be an inherent part of frontline practice. The issues discussed in this paper, however, highlight potential areas that might be targeted in conjunction with implementing personalised social care through enhanced choice for people with disabilities.
Abstract:
This dissertation is about the research carried out in developing an MPS (Multipurpose Portable System), which consists of an instrument and many accessories. The instrument is portable, hand-held, and rechargeable-battery operated, and it measures temperature, absorbance, and concentration of samples using optical principles. The system also performs auxiliary functions such as incubation and mixing. This system can be used in environmental, industrial, and medical applications. Research emphasis is on system modularity, easy configuration, accuracy of measurements, power management schemes, reliability, low cost, computer interface, and networking. The instrument can send the data to a computer for analysis and presentation, or to a printer. This dissertation presents a full working system, which involved the integration of hardware, firmware for the micro-controller in assembly language, software in C, and other application modules. The instrument contains the Optics, Transimpedance Amplifiers, Voltage-to-Frequency Converters, LCD display, Lamp Driver, Battery Charger, Battery Manager, Timer, Interface Port, and Micro-controller. The accessories are a Printer, a Data Acquisition Adapter (to transfer the measurements to a computer via the Printer Port and expand the Analog/Digital conversion capability), a Car Plug Adapter, and an AC Transformer. The system has been fully evaluated for fault tolerance, and these schemes are also presented.
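The instrument reports absorbance and concentration from optical measurements; the usual relation behind such readings is the Beer-Lambert law, sketched below under assumed example values (the abstract does not state the instrument's actual calibration).

```python
import math

def absorbance(intensity_sample, intensity_reference):
    """Absorbance from transmitted light intensities: A = -log10(I / I0)."""
    return -math.log10(intensity_sample / intensity_reference)

def concentration(absorbance_value, molar_absorptivity, path_length_cm):
    """Beer-Lambert law: A = epsilon * c * l  =>  c = A / (epsilon * l)."""
    return absorbance_value / (molar_absorptivity * path_length_cm)

# Assumed values: 40% transmission, epsilon = 1.2e4 L/(mol*cm), 1 cm cuvette
A = absorbance(0.40, 1.00)
print(A, concentration(A, 1.2e4, 1.0))
```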
Abstract:
This dissertation established a state-of-the-art programming tool for designing and training artificial neural networks (ANNs) and showed its applicability to brain research. The developed tool, called NeuralStudio, allows users without programming skills to conduct ANN-based studies through a powerful and very user-friendly interface. A series of unique features has been implemented in NeuralStudio, such as ROC analysis, cross-validation, network averaging, topology optimization, and optimization of the activation functions' slopes. It also includes a Support Vector Machines module for comparison purposes. Once the tool was fully developed, it was applied to two studies in brain research. In the first study, the goal was to create and train an ANN to detect epileptic seizures from subdural EEG; this analysis involved extracting features from the spectral power in the gamma frequencies. In the second application, a unique method was devised to link EEG recordings to epileptic and non-epileptic subjects. The contribution of this method was a descriptor matrix that can represent any EEG file, whatever its duration and number of electrodes. The first study showed that the inter-electrode mean of the spectral power in the gamma frequencies, together with its duration above a specific threshold, performs better than the other frequency bands in seizure detection, exhibiting an accuracy of 95.90%, a sensitivity of 92.59%, and a specificity of 96.84%. The second study showed that Hjorth's activity parameter is sufficient to accurately relate EEG to epileptic and non-epileptic subjects; after testing, the accuracy, sensitivity and specificity of the classifier were all above 0.9667, and statistical tests established the superiority of activity with over 99.99% certainty. It was demonstrated that 1) the spectral power in the gamma frequencies is highly effective in locating seizures from EEG and 2) activity can be used to link EEG recordings to epileptic and non-epileptic subjects. These two studies required a high computational load and could be addressed thanks to NeuralStudio. From a medical perspective, both methods proved the merits of NeuralStudio in brain research applications. For its outstanding features, NeuralStudio has recently been awarded a patent (US patent No. 7502763).
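The two EEG features named in the abstract (spectral power in the gamma band and Hjorth's activity parameter) are standard quantities; the sketch below computes them with numpy/scipy on a synthetic signal as a generic illustration, not NeuralStudio's implementation, with the gamma band assumed to be 30-70 Hz and the sampling rate assumed to be 256 Hz.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                    # sampling rate in Hz (assumed)
t = np.arange(10 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 40 * t) + rng.standard_normal(t.size)  # synthetic channel

# Spectral power in the gamma band (assumed 30-70 Hz) via Welch's periodogram
f, psd = welch(eeg, fs=fs, nperseg=fs)
band = (f >= 30) & (f <= 70)
gamma_power = np.trapz(psd[band], f[band])

# Hjorth's activity parameter is simply the variance of the signal
activity = np.var(eeg)

print(gamma_power, activity)
```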