Abstract:
Simultaneous acquisition of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) aims to disentangle the description of brain processes by exploiting the advantages of each technique. Most studies in this field focus on exploring the relationships between fMRI signals and the power spectrum in specific frequency bands (alpha, beta, etc.). On the other hand, brain mapping of EEG signals (e.g., interictal spikes in epileptic patients) usually assumes a haemodynamic response function, as a rough approximation, for a parametric analysis applying the general linear model (GLM). The integration of the information provided by the high spatial resolution of MR images and the high temporal resolution of EEG may be improved by relating them through transfer functions, which allow the identification of neurally driven areas without strong assumptions about haemodynamic response shapes or the homogeneity of brain haemodynamics. The difference in sampling rates is the first obstacle to a full integration of EEG and fMRI information. Moreover, no parametric specification of a function representing the commonalities of both signals has been established. In this study, we introduce a new data-driven method for estimating the transfer function from the EEG signal to the fMRI signal at the EEG sampling rate. This approach avoids subsampling the EEG to the fMRI time resolution and naturally provides a test of the EEG's predictive power over BOLD signal fluctuations, within a well-established statistical framework. We illustrate this concept with resting-state (eyes closed) and visual simultaneous fMRI-EEG experiments. The results indicate that it is possible to predict BOLD fluctuations in the occipital cortex using EEG measurements.
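To make the core idea concrete, below is a minimal sketch (not the paper's code, and assuming synthetic data): an FIR transfer function from an EEG-derived regressor to a BOLD time series is estimated by ordinary least squares, i.e., a GLM over lagged copies of the input.

```python
# Minimal sketch (not the paper's code), assuming synthetic data: estimate a
# finite-impulse-response transfer function from an EEG-derived regressor to a
# BOLD time series by ordinary least squares, i.e., a GLM over lagged inputs.
import numpy as np

rng = np.random.default_rng(0)
n, n_lags = 1000, 20                       # samples, FIR kernel length

eeg = rng.standard_normal(n)               # stand-in for an EEG power time course
true_h = np.exp(-np.arange(n_lags) / 5.0)  # toy haemodynamic-like kernel
bold = np.convolve(eeg, true_h)[:n] + 0.5 * rng.standard_normal(n)

# Design matrix of lagged EEG values: column k holds the EEG delayed by k samples.
X = np.column_stack([np.roll(eeg, k) for k in range(n_lags)])
X[:n_lags, :] = 0.0                        # discard np.roll wrap-around samples

h_hat, *_ = np.linalg.lstsq(X, bold, rcond=None)   # GLM estimate of the kernel
print(np.round(h_hat[:5], 2), np.round(true_h[:5], 2))
```

An F-test of these lag coefficients against an intercept-only model would provide the kind of test of EEG predictive power over BOLD fluctuations that the abstract mentions.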
Abstract:
Purpose The purpose of this report was to demonstrate the normal complex insertional anatomy of the tibialis posterior tendon (TPT) in cadavers using magnetic resonance (MR) imaging with anatomic and histologic correlation. Material and methods Ten cadaveric ankles were used according to institutional guidelines. MR T1-weighted spin echo imaging was performed to demonstrate aspects of the complex anatomic distal insertions of the TPT in cadaveric specimens. Findings on MR imaging were correlated with those derived from anatomic and histologic study. Results Generally, the TPT showed low signal in all MR images, except near the level of the medial malleolus, where the TPT suddenly changes direction and "magic angle" artifact could be observed. In five of the ten specimens (50%), a type I accessory navicular bone was found in the TPT; in all of these cases, the TPT had an altered signal in this area. Axial and coronal planes on MR imaging were the best for identifying the distal insertions of the TPT. In the five specimens without an accessory navicular bone, a normal division of the TPT was observed just proximal to its insertion into the navicular bone (100%), occurring at most approximately 1.5 to 2 cm proximal to its attachment to the navicular bone. In the other five specimens, in which a type I accessory navicular bone was present, the TPT inserted directly into the accessory bone, and a slip less than 1.5 mm in thickness could be observed attaching to the medial aspect of the navicular bone (100%). Anatomic inspection confirmed the sites of the distal insertions of the components of the TPT. Conclusion MR imaging enabled detailed analysis of the complex distal insertions of the TPT as well as a better understanding of those features of its insertion that can simulate a lesion.
Abstract:
OBJECTIVE. The purpose of the study was to investigate patient characteristics associated with image quality and their impact on the diagnostic accuracy of MDCT for the detection of coronary artery stenosis. MATERIALS AND METHODS. Two hundred ninety-one patients with a coronary artery calcification (CAC) score of <= 600 Agatston units (214 men and 77 women; mean age, 59.3 +/- 10.0 years [SD]) were analyzed. An overall image quality score was derived using an ordinal scale. The accuracy of quantitative MDCT in detecting significant (>= 50%) stenoses was assessed against quantitative coronary angiography (QCA) per patient and per vessel using a modified 19-segment model. The effects of CAC, obesity, heart rate, and heart rate variability on image quality and accuracy were evaluated by multiple logistic regression. Image quality and accuracy were further analyzed in subgroups of significant predictor variables. Diagnostic accuracy was assessed across image quality strata using receiver operating characteristic (ROC) curves. RESULTS. Increasing body mass index (BMI) (odds ratio [OR] = 0.89, p < 0.001), increasing heart rate (OR = 0.90, p < 0.001), and the presence of breathing artifact (OR = 4.97, p = 0.001) were associated with poorer image quality, whereas sex, CAC score, and heart rate variability were not. Compared with examinations of white patients, studies of black patients had significantly poorer image quality (OR = 0.58, p = 0.04). At the vessel level, CAC score (per 10 Agatston units) (OR = 1.03, p = 0.012) and patient age (OR = 1.02, p = 0.04) were significantly associated with the diagnostic accuracy of quantitative MDCT compared with QCA. A trend was observed in differences in the areas under the ROC curves across image quality strata at the vessel level (p = 0.08). CONCLUSION. Image quality is significantly associated with patient ethnicity, BMI, mean scan heart rate, and the presence of breathing artifact, but not with CAC score, at the patient level. At the vessel level, CAC score and age were associated with reduced diagnostic accuracy.
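As an illustration of the type of analysis described, the hedged sketch below fits a multiple logistic regression of image quality on patient-level predictors and reports odds ratios and an ROC AUC; the data are synthetic, and all variable names and coefficients are assumptions, not the study's values.

```python
# Hedged sketch, assuming synthetic data: multiple logistic regression of image
# quality on patient-level predictors, with odds ratios and an ROC AUC. The
# variable names and coefficients are illustrative, not the study's values.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 291
bmi = rng.normal(28, 4, n)
heart_rate = rng.normal(62, 8, n)
breathing_artifact = rng.binomial(1, 0.1, n)

# Toy outcome: odds of good image quality fall with BMI, heart rate, and
# breathing artifact, mirroring the direction of the reported associations.
logit = 10.0 - 0.12 * bmi - 0.10 * heart_rate - 1.6 * breathing_artifact
good_quality = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([bmi, heart_rate, breathing_artifact])
model = LogisticRegression().fit(X, good_quality)
print("odds ratios:", np.round(np.exp(model.coef_[0]), 2))   # OR per unit change
print("AUC:", round(roc_auc_score(good_quality, model.predict_proba(X)[:, 1]), 2))
```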
Abstract:
The cytochrome P450 (P450) enzymes involved in drug metabolism are among the most versatile biological catalysts known. A small number of discrete forms of human P450 are capable of catalyzing the monooxygenation of a practically unlimited variety of xenobiotic substrates, with each enzyme showing a more or less broad, overlapping substrate range. This versatility makes P450s ideally suited as starting materials for engineering designer catalysts for industrial applications. In the course of heterologous expression of P450s in bacteria, we observed the unexpected formation of blue pigments. Although this was initially assumed to be an artifact, subsequent work led to the discovery of a new function of P450s in intermediary metabolism and toxicology, new screens for protein engineering, and potential applications in the dye and horticulture industries.
Abstract:
Neurological disease or dysfunction in newborn infants is often first manifested by seizures. Prolonged seizures can result in impaired neurodevelopment or even death. In adults, the clinical signs of seizures are well defined and easily recognized. In newborns, however, the clinical signs are subtle and may be absent or easily missed without constant close observation. This article describes the use of adaptive signal processing techniques for removing artifacts from newborn electroencephalogram (EEG) signals. Three adaptive algorithms have been designed in the context of EEG signals. This preprocessing is necessary before attempting a fine time-frequency analysis of EEG rhythmical activities, such as electrical seizures, corrupted by high-amplitude signals. After an overview of newborn EEG signals, the authors describe the data acquisition set-up. They then introduce the basic physiological concepts related to normal and abnormal newborn EEGs and discuss the three adaptive algorithms for artifact removal. They also present time-frequency representations (TFRs) of seizure signals and discuss the estimation and modeling of the instantaneous frequency related to the main ridge of the TFR.
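The abstract does not specify the three algorithms, so the sketch below shows one standard adaptive-filtering scheme under stated assumptions: an LMS noise canceller that filters a reference channel correlated with the artifact to predict and subtract it from the contaminated EEG; all signals are synthetic and illustrative.

```python
# One standard adaptive-filtering scheme, sketched under assumptions (the
# abstract does not specify the three algorithms): an LMS noise canceller that
# filters a reference channel correlated with the artifact (e.g., ECG or EOG)
# to predict and subtract it from the contaminated EEG.
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """Return the artifact-reduced signal; the error signal is the cleaned EEG."""
    w = np.zeros(n_taps)
    cleaned = np.zeros_like(primary)
    for i in range(n_taps - 1, len(primary)):
        x = reference[i - n_taps + 1:i + 1][::-1]  # most recent reference samples
        e = primary[i] - w @ x                     # error = cleaned EEG sample
        w += 2 * mu * e * x                        # LMS weight update
        cleaned[i] = e
    return cleaned

rng = np.random.default_rng(2)
t = np.arange(5000) / 250.0                        # 250 Hz, a typical EEG rate
eeg = np.sin(2 * np.pi * 3 * t)                    # toy rhythmic activity
reference = rng.standard_normal(t.size)            # artifact reference channel
artifact = np.convolve(reference, [1.5, 0.8, 0.3])[:t.size]
cleaned = lms_cancel(eeg + artifact, reference)    # converges toward 'eeg'
```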
Abstract:
This research analyzed resistance to the high school History curriculum prescribed by the Espírito Santo State Department of Education (Sedu) in 2009, to be implemented in its school network by teachers at this stage of basic education. Its objective was to investigate the causes of resistance to the document and to identify what teachers resist, why they resist, and how they are materializing their resistance to it. Resistance is understood as the set of practices exercised by teachers that manifest as opposition, in an attempt to block domination and avoid losing their identity. It is a conscious resistance that, although rejecting the curriculum, does not deny it; rather than submitting to it passively, it takes the position of demanding its re-elaboration, its reinvention. The theoretical foundation drew on research and studies of works and concepts on curriculum, resistance, secondary education, and their relations with education. The work belongs to the field of education, in the research line Culture, Curriculum and Teacher Education. The research is qualitative in nature and was grounded in a narrative approach. Its methodological procedures relied on documentary and bibliographic analysis, a pre-structured questionnaire, observations, and conversations with four high school History teachers in the municipality of Afonso Cláudio, Espírito Santo. Comparison of the data produced confirmed the assumption presented in this work. The dimensions generating resistance were: the prescription itself, since the teachers consider curriculum design to be their attribution, together with the school; the organization of content presented by Sedu; the absence of linearity in historical events; the arrangement of knowledge by thematic axes; the orientation toward interdisciplinary work; the decoupling of each grade/year's content from the textbook; and the bureaucratic demands of implementing the curriculum. The work's contribution to the state school network was to problematize resistance to the curriculum, an educational artifact that can produce stability or tension among the subjects involved with it, which may be useful for further discussion. For the teachers, the work was relevant for having created a space for debate on the high school History curriculum during the conversations at school.
Abstract:
Model-driven software development advocates the use of models as artifacts that participate actively in the development process; the model occupies a position at the same level as the code. This is an important approach that has received growing attention in recent times. The Object Management Group (OMG) is responsible for one of the main specifications used to define the architecture of systems developed in a model-driven way: Model Driven Architecture (MDA). The modeling and domain-specific language projects that have emerged for the Eclipse platform are a good example of the attention given to these areas: fully open to the community, they seek to respect the standards and constitute an excellent opportunity to test and put into practice new ideas and approaches. This dissertation used tools created within the Amalgamation Project, developed for the Eclipse platform. Exploiting UML and using the QVT language, an automatic process was developed to extract elements of the system architecture from the requirements definition. Requirements are represented by UML models, which are transformed to obtain elements for an initial approximation of the system architecture. The final result is a UML model that aggregates the components, interfaces, and data types extracted from the requirements models. This model-driven approach proved feasible, capable of delivering practical results, and promising with respect to future work.
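As a toy illustration of the transformation idea (the dissertation itself uses QVT over UML models on Eclipse; this Python stand-in is purely an assumption for demonstration), the sketch below derives candidate components and interfaces from named requirements.

```python
# Toy Python stand-in (assumption for illustration; the dissertation uses QVT
# over UML models on Eclipse): derive candidate architectural components and
# interfaces from named requirements/use cases.
def requirements_to_architecture(use_cases):
    """Map each use case to a candidate component exposing one interface."""
    architecture = {"components": [], "interfaces": []}
    for uc in use_cases:
        name = uc.title().replace(" ", "")
        architecture["components"].append(name + "Component")
        architecture["interfaces"].append("I" + name)
    return architecture

print(requirements_to_architecture(["manage accounts", "process payments"]))
# {'components': ['ManageAccountsComponent', 'ProcessPaymentsComponent'],
#  'interfaces': ['IManageAccounts', 'IProcessPayments']}
```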
Abstract:
The development of robust multilingual resources to meet the growing complexity of intra- and inter-organizational processes is a complex undertaking that demands higher quality in the ways organizations interact and share their resources, for example through greater involvement of the different stakeholders in effective and innovative forms of collaboration. It is a process in which several problems and difficulties can be identified, such as, in the case of building multilingual lexical databases, the development of an architecture capable of answering a vast set of linguistic questions, including polysemy, lexical patterns, and translation equivalents. These questions arise in the construction both of terminological resources and of multilingual ontologies. In building an ontology in different languages, the process on which we focus our attention, the questions and the complexity increase, given the type and purposes of the semantic artifact, the elements to be localized (concepts and conceptual relations), and the context in which the localization process occurs. In this article we therefore analyze the concept and process of localization in the context of ontology-based knowledge management systems, considering the central role of terminology in the localization process, the different approaches and models proposed, and the linguistically based tools that support the implementation of the process. Finally, we attempt to draw some parallels between the traditional localization process and the process of ontology localization, in order to better situate and define the latter.
Abstract:
The WWW is a huge, open, heterogeneous system; however, its content is mainly human-oriented. The Semantic Web needs to ensure that data is readable and "understandable" by intelligent software agents, through the use of explicit and formal semantics. Ontologies constitute a privileged artifact for capturing the semantics of WWW data. The temporal and spatial dimensions are transversal to most knowledge domains and are therefore fundamental for the reasoning processes of software agents. Representing the temporal/spatial evolution of concepts and their relations in OWL (the W3C standard for ontologies) is not straightforward. Although several strategies have been proposed to tackle this problem, there is still no formal, standard approach. The main goal of this work is the development of methods/tools to support the engineering of temporal and spatial aspects in intelligent systems through the use of OWL ontologies. An existing ontology engineering method, Fonte, was used as the framework for the development of this work. As the main contributions of this work, Fonte was re-engineered in order to: i) support the spatial dimension; ii) work with OWL ontologies; and iii) support the application of Ontology Design Patterns. Finally, the capabilities of the proposed approach were demonstrated by engineering time and space in a demo ontology about football.
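To illustrate one common strategy this line of work builds on (an assumption for illustration, not necessarily the approach adopted here), the sketch below reifies a time-indexed relation as an N-ary relation individual, since OWL properties are binary and cannot carry a temporal qualifier directly; it uses rdflib, and the football example and all URIs are illustrative.

```python
# Hedged sketch of one common strategy (an assumption; not necessarily the
# approach adopted in this work): reify a time-indexed relation as an N-ary
# relation individual, since OWL properties are binary and cannot carry a
# temporal qualifier directly. Uses rdflib; all URIs and names are illustrative.
from rdflib import Graph, Namespace, RDF, Literal
from rdflib.namespace import XSD

EX = Namespace("http://example.org/football#")
g = Graph()
g.bind("ex", EX)

# "playsFor" holds only during an interval, so the statement itself becomes an
# individual linking player, club, and the validity interval.
stint = EX.Stint_001
g.add((stint, RDF.type, EX.PlaysForStint))
g.add((stint, EX.player, EX.SomePlayer))
g.add((stint, EX.club, EX.SomeClub))
g.add((stint, EX.validFrom, Literal("2010-07-01", datatype=XSD.date)))
g.add((stint, EX.validTo, Literal("2013-06-30", datatype=XSD.date)))

print(g.serialize(format="turtle"))
```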
Abstract:
The particular characteristics and affordances of technologies play a significant role in human experience by defining the realm of possibilities available to individuals and societies. Some technological configurations, such as the Internet, facilitate peer-to-peer communication and participatory behaviors. Others, like television broadcasting, tend to encourage centralization of creative processes and unidirectional communication. In other instances still, the affordances of technologies can be further constrained by social practices. That is the case, for example, of radio, which, although technically allowing peer-to-peer communication, has effectively been converted into a broadcast medium through the legislation of the airwaves. How technologies acquire particular properties, meanings and uses, and who is involved in those decisions are the broader questions explored here. Although a long line of thought maintains that technologies evolve according to the logic of scientific rationality, recent studies have demonstrated that technologies are, in fact, primarily shaped by social forces in specific historical contexts. In this view, adopted here, there is no one best way to design a technological artifact or system; the selection between alternative designs—which determine the affordances of each technology—is made by social actors according to their particular values, assumptions and goals. Thus, the arrangement of technical elements in any technological artifact is configured to conform to the views and interests of those involved in its development. Understanding how technologies assume particular shapes, who is involved in these decisions and how, in turn, they foster particular behaviors and modes of organization but not others, requires understanding the contexts in which they are developed. It is argued here that, throughout the last century, two distinct approaches to the development and dissemination of technologies have coexisted. In each of these models, based on fundamentally different ethoi, technologies are developed through different processes and by different participants—and therefore tend to assume different shapes and offer different possibilities. In the first of these approaches, the dominant model in Western societies, technologies are typically developed by firms, manufactured in large factories, and subsequently disseminated to the rest of the population for consumption. In this centralized model, the role of users is limited to selecting from the alternatives presented by professional producers. Thus, according to this approach, the technologies that are now so deeply woven into human experience are primarily shaped by a relatively small number of producers. In recent years, however, three interconnected interest groups—the makers, hackerspaces, and open source hardware communities—have increasingly challenged this dominant model by enacting an alternative approach in which technologies are both individually transformed and collectively shaped. Through an in-depth analysis of these phenomena, their practices and ethos, it is argued here that the distributed approach practiced by these communities offers a practical path towards a democratization of the technosphere by: 1) demystifying technologies, 2) providing the public with the tools and knowledge necessary to understand and shape technologies, and 3) encouraging citizen participation in the development of technologies.
Abstract:
Doctoral thesis in Psychology - Specialty in Experimental Psychology and Cognitive Sciences
Abstract:
The aim of this study was to determine the red blood cell parameters and the thrombocyte and leukocyte counts in farmed Brycon amazonicus (matrinxã), to compare these parameters with those of Bryconinae species reported in the literature, and to investigate the presence of special granulocytic cells in these fish. The blood cell parameters established here for farmed B. amazonicus, a species of great economic importance in Brazilian aquaculture, may support a better understanding of the blood features of natural populations of this Amazon species. Blood parameters varied among the Bryconinae species investigated, mainly the red blood cell counts, hemoglobin, hematocrit, and mean corpuscular volume (MCV). The presence of both blood granulocytes, neutrophils and heterophils, in matrinxã suggests that both leukocytes may be characteristic of the Bryconinae family. Furthermore, it indicates that the existence of special granulocytic cells in the blood of Bryconinae species reported in the literature is an artifact, as discussed herein.
Abstract:
Doctoral thesis in Psychology (Specialty in Experimental Psychology and Cognitive Sciences)
Abstract:
A spreadsheet usually starts as a simple, single-user software artifact but, as frequently happens with other software systems, quickly evolves into a complex system developed by many actors. Often, different users work on different aspects of the same spreadsheet: while a secretary may only be involved in adding plain data to the spreadsheet, an accountant may define new business rules, and an engineer may need to adapt the spreadsheet content so it can be used by other software systems. Unfortunately, spreadsheet systems do not offer modular mechanisms, and as a consequence, some of the previous tasks may only be accomplished by adding intrusive "code" to the spreadsheet. In this paper we go through the design and implementation of an aspect-oriented language for spreadsheets so that users can work on different aspects of a spreadsheet in a modular way. For example, aspects can be defined to introduce new business rules into an existing spreadsheet, or to manipulate the spreadsheet data so it can be ported to another system. Aspects are defined as aspect-oriented program specifications that are dynamically woven into the underlying spreadsheet by an aspect weaver. In this aspect-oriented style of spreadsheet development, different users develop, or reuse, aspects without adding intrusive code to the original spreadsheet. Such code is added/executed by the spreadsheet weaving mechanism proposed in this paper.
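The paper defines its own aspect language; purely to illustrate the weaving concept, the toy Python sketch below treats a spreadsheet as a cell-to-value map and an aspect as a pointcut/advice pair applied by a weaver, leaving the original sheet untouched. All names are illustrative assumptions.

```python
# Toy analogue of spreadsheet aspect weaving (illustrative; not the paper's
# language). A spreadsheet is a dict of cell -> value; an aspect pairs a
# pointcut (which cells it applies to) with an advice (what to do with the
# value); the weaver applies all matching aspects without mutating the sheet.
def weave(sheet, aspects):
    woven = {}
    for cell, value in sheet.items():
        for pointcut, advice in aspects:
            if pointcut(cell, value):
                value = advice(value)
        woven[cell] = value
    return woven

sheet = {"A1": "price", "A2": 100.0, "A3": 250.0}

aspects = [
    # Business-rule aspect: apply a 10% discount to numeric cells.
    (lambda c, v: isinstance(v, float), lambda v: v * 0.9),
    # Export aspect: render numbers as strings for another system.
    (lambda c, v: isinstance(v, float), lambda v: f"{v:.2f}"),
]

print(weave(sheet, aspects))   # the original 'sheet' is left unmodified
```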
Abstract:
Integrated master's dissertation in Biomedical Engineering (specialization in Medical Informatics)