957 results for Education - Data processing


Relevância:

90.00%

Publicador:

Resumo:

In the context of focal epilepsy, the simultaneous combination of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) holds great promise as a technique by which the hemodynamic correlates of interictal spikes detected on scalp EEG can be identified. Because traditional EEG recordings have not been able to correlate the ictal clinical symptoms with onset in particular lobar areas, the epileptogenic cortical regions need to be mapped with greater precision. fMRI, on the other hand, has suggested localizations more consistent with the ictal clinical manifestations observed. This study was developed to improve our understanding of how the parameters involved in processing the physical and mathematical data produced by the EEG/fMRI technique influence the final results. Accuracy was evaluated by comparing the BOLD results with the high-resolution EEG maps, the malformative lesions detected in the T1-weighted MR images, and the anatomical localizations of the diagnosed symptomatology of each patient studied. Optimizing the set of parameters used will provide an important contribution to the diagnosis of epileptogenic foci in patients included in an epilepsy surgery evaluation programme. The results obtained allowed us to conclude that, by associating the BOLD effect with interictal spikes, the epileptogenic areas are mapped to localizations different from those obtained from the EEG maps representing the electrical potential distribution across the scalp, and that there is a solid link between the variation of particular parameters manipulated during fMRI data processing and the optimization of the final results, among which smoothing, the number of deleted volumes, the HRF used to convolve with the activation design, and the shape of the Gamma function stand out.
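
The abstract singles out the HRF used to convolve with the activation design and the shape of the Gamma function as influential parameters. The sketch below, which is not taken from the study, illustrates in Python how a Gamma-based HRF can be convolved with an interictal-spike regressor; the TR, run length, spike times and Gamma shape parameters are all assumptions made for illustration.

```python
# A minimal sketch (not from the study) of how an HRF built from Gamma
# functions might be convolved with an interictal-spike regressor to build
# the activation design for a BOLD analysis. All numeric values are
# illustrative assumptions.
import numpy as np
from scipy.stats import gamma

TR = 2.0                               # repetition time in seconds (assumed)
n_scans = 300                          # length of the fMRI run in volumes (assumed)
t = np.arange(0, 32, TR)               # HRF support, 0-32 s

def gamma_hrf(t, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
    """Double-gamma style HRF; the shape parameters are illustrative."""
    h = gamma.pdf(t, peak) - ratio * gamma.pdf(t, undershoot)
    return h / h.sum()

# Spike regressor: 1 at volumes where an interictal spike was marked on EEG.
spikes = np.zeros(n_scans)
spikes[[40, 41, 120, 200]] = 1.0       # hypothetical spike times

# Convolve and truncate to the scan length to obtain the design regressor.
regressor = np.convolve(spikes, gamma_hrf(t))[:n_scans]
```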

Relevância:

90.00%

Publicador:

Resumo:

Dissertation submitted in fulfilment of the requirements for the degree of Master in Electrical and Computer Engineering at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevância:

90.00%

Publicador:

Resumo:

Real-time data acquisition is fundamental to providing appropriate services and improving health professionals' decisions. In this paper a pervasive, adaptive architecture for acquiring data from medical devices (e.g. vital-signs monitors, ventilators and sensors) is presented. The architecture was deployed in a real context, an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network. The paper shows the ventilation acquisition process as an example. The clients are installed on a machine near the patient's bed and connected to the ventilators, and the monitored data are sent to a multithreaded server that records them in the database using Health Level Seven (HL7) protocols. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway also includes a fault-tolerant mechanism that ensures the data are stored in the database even if the agents are disconnected. The gateway is pervasive, universal and interoperable, and is able to adapt to any service using streaming data.
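
As a hedged illustration of the architecture described above (this is not the INTCare implementation), the sketch below shows the general shape of a multithreaded acquisition server: each bedside client opens a TCP connection, sends newline-delimited readings, and each connection is handled in its own thread and persisted to a local store. The host, port, message format and SQLite schema are assumptions made for the example.

```python
# Minimal sketch of a multithreaded acquisition server (not INTCare code).
# Each client sends lines of the form "patient_id;variable;value".
import sqlite3
import socketserver
import threading

DB_PATH = "acquisition.db"
_lock = threading.Lock()   # serialize writes from concurrent handler threads

def init_db():
    with sqlite3.connect(DB_PATH) as db:
        db.execute("CREATE TABLE IF NOT EXISTS readings "
                   "(patient_id TEXT, variable TEXT, value REAL)")

class ReadingHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # One SQLite connection per client connection (i.e. per handler thread).
        db = sqlite3.connect(DB_PATH)
        try:
            for line in self.rfile:                  # one reading per line
                if not line.strip():
                    continue
                patient_id, variable, value = line.decode().strip().split(";")
                with _lock:
                    db.execute("INSERT INTO readings VALUES (?, ?, ?)",
                               (patient_id, variable, float(value)))
                    db.commit()
        finally:
            db.close()

if __name__ == "__main__":
    init_db()
    with socketserver.ThreadingTCPServer(("0.0.0.0", 9000), ReadingHandler) as srv:
        srv.serve_forever()
```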

Relevância:

90.00%

Publicador:

Resumo:

There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, which are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be an effective, practical and robust means of enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth. Our results therefore indicate that spectral balancing techniques in general, and spectral blueing in particular, represent simple yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
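
To make the distinction between whitening and blueing concrete, the following is a minimal sketch (not the authors' code) of band-limited spectral balancing of a single georadar trace: the amplitude spectrum is flattened inside a pass band and, optionally, tilted towards higher frequencies. The sampling interval, band limits and blueing exponent are illustrative assumptions.

```python
# Band-limited spectral whitening / blueing of one trace (illustrative only).
import numpy as np

def spectral_balance(trace, dt, f_lo, f_hi, blue_exponent=0.0, eps=1e-8):
    """Whiten (blue_exponent=0) or blue (>0) the spectrum between f_lo and f_hi (Hz)."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    band = (freqs >= f_lo) & (freqs <= f_hi)

    shaped = np.zeros_like(spec)
    # Flatten the amplitude inside the band while keeping the phase, then
    # apply a gentle f**blue_exponent tilt that favours high frequencies.
    shaped[band] = (spec[band] / (np.abs(spec[band]) + eps)) * freqs[band] ** blue_exponent
    return np.fft.irfft(shaped, n=trace.size)

# Example: whiten vs. blue a stand-in trace sampled at 0.8 ns.
trace = np.random.randn(512)
whitened = spectral_balance(trace, dt=0.8e-9, f_lo=25e6, f_hi=200e6)
blued = spectral_balance(trace, dt=0.8e-9, f_lo=25e6, f_hi=200e6, blue_exponent=0.5)
```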

Relevância:

90.00%

Publicador:

Resumo:

This final-degree project belongs to the area of Professional Competences and its objective is the analysis of teaching-learning experiences of the written communication competence in ICT curricula. The study consists of three parts: contextualization, research and reflection. The contextualization defines the concept of professional competence and classifies the generic or transversal competences into instrumental, interpersonal and systemic competences; finally, it lists the generic competences expected of a Computer Engineer according to the white paper for the Computer Engineering degree. The research was carried out on the Computer Engineering degree curricula of 20 Spanish universities. The first part identifies which universities include generic competences in their curricula and classifies them; the second part focuses on locating the written communication competence and its learning objectives. The reflection part identifies the explicit and implicit generic competences developed in the curriculum followed in the UOC study itinerary, and also analyses the UOC educational model. The motivation for this research project is to verify whether the Computer Engineering degree curricula have been adapted to the European Higher Education Area (EEES), and in particular whether the selected universities intend to develop the written communication competence. This will allow us to analyse whether a Computer Engineering graduate has received adequate training to achieve this competence.

Relevância:

90.00%

Publicador:

Resumo:

This work examines the Departament d'Educació as a large enterprise, analysing its objectives, functions and organizational structure, and studies the advantages that implementing an ERP could bring. It also examines in detail the steps that would have to be followed to implement SAP R/3 and points out the weak points so that they can be anticipated and the success of the implementation ensured.

Relevância:

90.00%

Publicador:

Resumo:

Configuration of a development environment in the Eclipse IDE. Introduction to GIS: uses, applications and examples. Getting to know the gvSIG tool. Getting to know the most widespread Open Geospatial Consortium (OGC) standards, in particular the Web Processing Service (WPS). Analysis, design and development of a client capable of consuming WPS services, as sketched below at the protocol level.
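
As a hedged illustration of what consuming a WPS service involves at the protocol level (this is not code from the project, and the endpoint URL is a placeholder), the sketch below issues a WPS 1.0.0 GetCapabilities request and lists the process identifiers advertised by the server.

```python
# Minimal sketch: query a WPS 1.0.0 endpoint for its capabilities and list
# the advertised process identifiers. The endpoint URL is a placeholder.
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

WPS_URL = "http://example.org/wps"          # placeholder endpoint
OWS_NS = "{http://www.opengis.net/ows/1.1}"

params = {"service": "WPS", "version": "1.0.0", "request": "GetCapabilities"}
with urlopen(WPS_URL + "?" + urlencode(params)) as response:
    tree = ET.parse(response)

# Each offered process is named by an ows:Identifier element.
for ident in tree.iter(OWS_NS + "Identifier"):
    print(ident.text)
```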

Relevância:

90.00%

Publicador:

Resumo:

Using free software, a teaching unit based on the "terceti" game was produced for interactive digital television and saved in SCORM format. The ease of use of the remote control makes it possible to practise a form of t-learning based on edutainment (education + entertainment), turning the passive television viewer into a more active "tele-user".

Relevância:

90.00%

Publicador:

Resumo:

Study of the standards defined by the Open Geospatial Consortium, and more specifically of the Web Processing Service (WPS) standard. The work also had a practical component, consisting of the design and development of a client capable of consuming Web services created according to WPS and integrated into the gvSIG platform.
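
As with the previous WPS entry, a hedged protocol-level sketch may help make the practical component concrete (this is not the project's code; the endpoint and process identifier are placeholders): a DescribeProcess request tells the client which inputs and outputs a given process expects.

```python
# Minimal sketch, complementary to the GetCapabilities example above:
# ask a WPS 1.0.0 server to describe one process. In the response, the
# process and its inputs and outputs are named by ows:Identifier elements.
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

WPS_URL = "http://example.org/wps"               # placeholder endpoint
OWS_NS = "{http://www.opengis.net/ows/1.1}"

params = {"service": "WPS", "version": "1.0.0",
          "request": "DescribeProcess", "identifier": "some.process.id"}
with urlopen(WPS_URL + "?" + urlencode(params)) as response:
    tree = ET.parse(response)

for ident in tree.iter(OWS_NS + "Identifier"):
    print(ident.text)
```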

Relevância:

90.00%

Publicador:

Resumo:

Analysis of teaching-learning experiences of the written communication competence at the Computer Engineering degree level.

Relevância:

90.00%

Publicador:

Resumo:

Project to upgrade the IT system of a school.

Relevância:

90.00%

Publicador:

Resumo:

This paper presents the state of the art of distributed systems and applications, with a focus on teaching about these systems. It presents different platforms on which distributed applications can run and describes some development toolkits that can be used to build prototypes, lab exercises and distributed applications. It also presents some existing distributed algorithms that are useful for class practice, and some tools that help manage distributed environments. Finally, the paper presents some teaching experiences with different approaches to teaching about distributed systems.
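
The abstract does not name specific algorithms, but as an illustration of the kind of distributed algorithm that suits a class practice, the sketch below implements Lamport logical clocks, a classic example for teaching event ordering without shared physical time. It is an assumption for illustration, not an algorithm taken from the paper.

```python
# Lamport logical clocks: each process keeps a counter, increments it on
# local events and sends, and jumps past any timestamp it receives.
class LamportProcess:
    def __init__(self, name):
        self.name = name
        self.clock = 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        # A send is also an event; the message carries the new timestamp.
        self.clock += 1
        return self.clock

    def receive(self, timestamp):
        # Advance past the sender's timestamp, then count the receive event.
        self.clock = max(self.clock, timestamp) + 1
        return self.clock

# Usage: p1 sends to p2, which has seen fewer events so far.
p1, p2 = LamportProcess("p1"), LamportProcess("p2")
p1.local_event()                 # p1 clock: 1
ts = p1.send()                   # p1 clock: 2, message carries 2
print(p2.receive(ts))            # p2 jumps to 3
```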

Relevância:

90.00%

Publicador:

Resumo:

The sparsely spaced, highly permeable fractures of the granitic rock aquifer at Stang-er-Brune (Brittany, France) form a well-connected fracture network of high permeability but unknown geometry. Previous work based on optical and acoustic logging together with single-hole and cross-hole flowmeter data acquired in three neighbouring boreholes (70-100 m deep) has identified the most important permeable fractures crossing the boreholes and their hydraulic connections. To constrain possible flow paths by estimating the geometries of known and previously unknown fractures, we have acquired, processed and interpreted multifold single-hole and cross-hole GPR data using 100 and 250 MHz antennas. The GPR data processing scheme, consisting of time-zero corrections, scaling, bandpass filtering, F-X deconvolution, eigenvector filtering, muting, pre-stack Kirchhoff depth migration and stacking, was used to differentiate fluid-filled fracture reflections from source-generated noise. The final stacked and pre-stack depth-migrated GPR sections provide high-resolution images of individual fractures (dipping 30-90°) in the surroundings of each borehole (2-20 m for the 100 MHz antennas; 2-12 m for the 250 MHz antennas) in a 2D plane projection, and these images are of superior quality to those obtained from single-offset sections. Most fractures previously identified from hydraulic testing can be correlated with reflections in the single-hole data. Several previously unknown major, near-vertical fractures have also been identified away from the boreholes.
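
As a hedged illustration of one step in the processing chain listed above (not the authors' code), the sketch below applies a zero-phase Butterworth bandpass filter to a single GPR trace. The sampling interval and corner frequencies are illustrative assumptions for 100 MHz antennas.

```python
# Zero-phase bandpass filtering of one GPR trace (illustrative values only).
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(trace, dt, f_lo, f_hi, order=4):
    """Zero-phase Butterworth bandpass between f_lo and f_hi (Hz)."""
    nyquist = 0.5 / dt
    b, a = butter(order, [f_lo / nyquist, f_hi / nyquist], btype="band")
    return filtfilt(b, a, trace)

dt = 0.8e-9                              # 0.8 ns sampling interval (assumed)
trace = np.random.randn(1024)            # stand-in for a recorded trace
filtered = bandpass(trace, dt, f_lo=25e6, f_hi=200e6)
```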

Relevância:

90.00%

Publicador:

Resumo:

BACKGROUND: Solexa/Illumina short-read ultra-high-throughput DNA sequencing technology produces millions of short tags (up to 36 bases) by parallel sequencing-by-synthesis of DNA colonies. The processing and statistical analysis of such high-throughput data pose new challenges; currently a fair proportion of the tags are routinely discarded because they cannot be matched to a reference sequence, reducing the effective throughput of the technology. RESULTS: We propose a novel base-calling algorithm that uses model-based clustering and probability theory to identify ambiguous bases and code them with IUPAC symbols. We also select optimal sub-tags using a score based on information content to remove uncertain bases towards the ends of the reads. CONCLUSION: We show that the method improves genome coverage and the number of usable tags by an average of 15% compared with Solexa's data processing pipeline. An R package is provided which allows fast and accurate base calling of Solexa's fluorescence intensity files and the production of informative diagnostic plots.
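
To make the two ideas in the abstract tangible, here is a toy sketch, written in Python rather than R and not taken from the published package: (1) call an IUPAC ambiguity symbol when more than one base is plausible, and (2) trim the read where the per-base information content drops too low. The probability thresholds are assumptions for illustration.

```python
# Toy IUPAC ambiguity calling and information-content trimming (illustrative).
import math

IUPAC = {frozenset("A"): "A", frozenset("C"): "C",
         frozenset("G"): "G", frozenset("T"): "T",
         frozenset("AG"): "R", frozenset("CT"): "Y",
         frozenset("GC"): "S", frozenset("AT"): "W",
         frozenset("GT"): "K", frozenset("AC"): "M",
         frozenset("CGT"): "B", frozenset("AGT"): "D",
         frozenset("ACT"): "H", frozenset("ACG"): "V",
         frozenset("ACGT"): "N"}

def call_base(probs, threshold=0.25):
    """probs: dict base -> probability. Keep every base above the threshold."""
    plausible = frozenset(b for b, p in probs.items() if p >= threshold)
    return IUPAC.get(plausible, "N")

def information_content(probs):
    """2 - entropy, in bits: 2 for a certain call, 0 for a uniform one."""
    h = -sum(p * math.log2(p) for p in probs.values() if p > 0)
    return 2.0 - h

def call_read(per_cycle_probs, min_info=0.5):
    """Call bases cycle by cycle and stop once the signal becomes too noisy."""
    read = []
    for probs in per_cycle_probs:
        if information_content(probs) < min_info:
            break
        read.append(call_base(probs))
    return "".join(read)
```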