990 results for Online Identification


Relevance:

30.00%

Publisher:

Abstract:

The occurrence of transients in electrocardiogram (ECG) signals indicates an electrical phenomenon originating outside the heart. Identifying transients has therefore been the most widely used methodology in medical analysis since the invention of the electrocardiograph, the device responsible for recording electrocardiogram signals. Few papers address this subject, which motivates the creation of an architecture to pre-process the signal in order to identify transients. This paper proposes a method based on the signal energy of the Hilbert transform of the electrocardiogram, as an alternative to methods based on the morphology of the signal. This information determines the creation of frames of the MP-HA protocol, responsible for transmitting the ECG signals through an IEEE 802.3 network to a computing device. That device, in turn, may classify the signal automatically or present it to a physician for manual classification.
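The detection idea can be sketched in a few lines. This is a minimal illustration under assumed parameters (window length, threshold factor, synthetic signal), not the MP-HA pre-processing architecture itself: the envelope of the analytic signal obtained via the Hilbert transform is squared and summed per window, and windows whose energy stands out from the baseline are flagged as transients.

```python
import numpy as np
from scipy.signal import hilbert

def detect_transients(ecg, fs, win=0.1, k=3.0):
    """Flag windows whose Hilbert-envelope energy exceeds k times the median.

    win (window length in seconds) and k (threshold factor) are assumptions.
    """
    env = np.abs(hilbert(ecg))               # envelope of the analytic signal
    n = max(1, int(win * fs))                # samples per window
    m = len(env) // n
    energy = (env[:m * n] ** 2).reshape(m, n).sum(axis=1)
    return np.nonzero(energy > k * np.median(energy))[0]

# synthetic baseline with one injected spike standing in for a transient
fs = 250
t = np.arange(0, 4, 1 / fs)
sig = 0.1 * np.sin(2 * np.pi * 1.2 * t)
sig[500:505] += 2.0                          # transient near t = 2 s
hits = detect_transients(sig, fs)            # flagged window indices
```

Window 500 // 25 = 20 contains the spike and is flagged; a classifier or physician would then inspect only the flagged frames.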

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Small nuclear ribonucleoproteins (snRNPs) are involved in trans-splicing processing of pre-mRNA in Trypanosoma cruzi. To clone T. cruzi snRNPs, we screened an epimastigote cDNA library with a purified antibody raised against the Sm-binding site of a yeast sequence. A clone was obtained containing a 507 bp insert with an ORF of 399 bp, coding for a protein of 133 amino acids. Sequence analysis revealed high identity with the L27 ribosomal proteins from different species, including Canis familiaris, Homo sapiens, Schizosaccharomyces pombe and Saccharomyces cerevisiae. This protein, not previously described in the literature, appears to be a new ribosomal protein in T. cruzi and was given the code TcrL27. To express this recombinant T. cruzi L27 ribosomal protein in E. coli, the insert was subcloned into the pET32a vector and a 26 kDa recombinant protein was purified. Immunoblotting studies demonstrated that the purified recombinant protein was recognized by the same anti-Sm serum used in the library screening, as well as by chagasic and systemic lupus erythematosus (SLE) sera. Our results suggest that the T. cruzi L27 ribosomal protein may be involved in the autoimmunity of Chagas disease.

Relevance:

30.00%

Publisher:

Abstract:

The on-line separation and identification of two important taxonomic markers for plant species of the Paepalanthus genus, the flavonoids 6-methoxykaempferol-3-O-β-D-glucopyranoside and 6-methoxykaempferol-3-O-β-D-6″(p-coumaroyl)glucopyranoside, has been performed with HPLC-NMR coupling on a C30 stationary phase. 1D spectra were recorded in stopped-flow mode for the two predominant chromatographic peaks. This is the first application of HPLC-NMR coupling on a C30 phase to a taxonomic problem. The technique drastically reduces the amount of sample required for structure determination. © Springer-Verlag 2000.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an approach to structural health monitoring (SHM) using adaptive filters. Experimental signals from different structural conditions, provided by piezoelectric actuators/sensors bonded to the test structure, are modeled by a discrete-time recursive least squares (RLS) filter. The main advantage of using an RLS filter is the possibility of performing an online SHM procedure, since the identification is also valid for non-stationary linear systems. An online damage-sensitive index is computed from the autoregressive (AR) portion of the coefficients, each normalized by the square root of the sum of their squares. The proposed method is then applied in a laboratory test involving an aeronautical panel instrumented with piezoelectric sensors/actuators (PZTs) in different positions. A hypothesis test employing the t-test is used to reach the damage decision. The proposed algorithm was able to identify and localize the damage simulated in the structure. The results show the applicability and the drawbacks of the method, and the paper concludes with suggestions to improve it. ©2010 Society for Experimental Mechanics Inc.
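The modeling and damage-index steps can be sketched as follows. This is a generic RLS/AR illustration with assumed parameters (filter order, forgetting factor, toy AR processes standing in for the healthy and damaged responses), not the authors' implementation:

```python
import numpy as np

def rls_ar(y, order, lam=0.99):
    """Recursive least-squares estimate of AR coefficients (lam = forgetting factor)."""
    w = np.zeros(order)
    P = 1000.0 * np.eye(order)               # large initial inverse-correlation matrix
    for n in range(order, len(y)):
        x = y[n - order:n][::-1]             # regressor: [y[n-1], ..., y[n-order]]
        g = P @ x / (lam + x @ P @ x)        # gain vector
        w = w + g * (y[n] - w @ x)           # update with the a priori error
        P = (P - np.outer(g, x @ P)) / lam
    return w

def damage_index(w):
    """AR coefficients normalized by the square root of the sum of their squares."""
    return w / np.sqrt(np.sum(w ** 2))

rng = np.random.default_rng(0)
e = rng.standard_normal(2000)
y0 = np.zeros(2000)                          # "healthy" AR(2) response
y1 = np.zeros(2000)                          # "damaged" response with shifted dynamics
for n in range(2, 2000):
    y0[n] = 1.5 * y0[n - 1] - 0.7 * y0[n - 2] + e[n]
    y1[n] = 0.2 * y1[n - 1] + 0.5 * y1[n - 2] + e[n]
idx0 = damage_index(rls_ar(y0, order=2))
idx1 = damage_index(rls_ar(y1, order=2))
shift = np.linalg.norm(idx0 - idx1)          # feature shift between conditions
```

In the paper's setting, such index vectors from repeated measurements would feed the t-test; here the shift between the two simulated conditions is clearly nonzero.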

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Cardiac morphogenesis is a complex process governed by evolutionarily conserved transcription factors and signaling molecules. The Drosophila cardiac tube is linear, made of 52 pairs of cardiomyocytes (CMs) that express specific transcription factor genes whose human homologues (NKX2-5, GATA4 and TBX5) are implicated in Congenital Heart Diseases (CHDs). The tube is composed of a rostral portion named aorta and a caudal one called heart, distinguished by morphological and functional differences controlled by Hox genes, key regulators of axial patterning. Overexpression and inactivation of the Hox gene abdominal-A (abd-A), which is expressed exclusively in the heart, revealed that abd-A controls heart identity. The aim of our work is to isolate the heart-specific cis-regulatory sequences of abd-A direct target genes, the realizator genes granting heart identity. In each segment of the heart, four pairs of CMs express tinman (tin), homologous to NKX2-5, and acquire strong contractile and automatic rhythmic activities. By tyramide-amplified FISH, we found that seven genes, encoding ion channels, pumps or transporters, are specifically expressed in the Tin-CMs of the heart. We initially used freely available online tools to identify their heart-specific cis-regulatory modules by looking for Conserved Non-coding Sequences containing clusters of binding sites for various cardiac transcription factors, including Hox proteins. Based on these data we generated several reporter gene constructs and transgenic embryos, but none of them showed reporter gene expression in the heart. To identify additional abd-A target genes, we performed microarray experiments comparing the transcriptomes of aorta versus heart and identified 144 genes overexpressed in the heart.
To find the heart-specific cis-regulatory regions of these target genes, we developed a new bioinformatic approach in which prediction is based on pattern matching and ordered statistics. We first retrieved Conserved Non-coding Sequences from the alignment between the D. melanogaster and D. pseudoobscura genomes. We scored combinations of conserved occurrences of ABD-A, ABD-B, TIN, PNR, dMEF2, MADS-box, T-box and E-box sites, and ranked the results using two independent strategies: on one hand we ranked the putative cis-regulatory sequences according to the best-scoring ABD-A binding sites, on the other hand according to the conservation of binding sites. We then integrated and re-ranked the two independently obtained lists to produce a final rank. We generated nGFP reporter construct flies for in vivo validation and identified three 1 kb-long heart-specific enhancers. By in vivo and in vitro experiments we are determining whether they are direct abd-A targets, demonstrating the role of a Hox gene in the realization of heart identity. The identified abd-A direct target genes may also be targets of the NKX2-5, GATA4 and/or TBX5 homologues tin, pannier and the Doc genes, respectively. The identification of sequences co-regulated by a Hox protein and the homologues of transcription factors causing CHDs will provide a means to test whether these factors function as Hox cofactors granting cardiac specificity to Hox proteins, increasing our knowledge of the molecular mechanisms underlying CHDs. Finally, it may be investigated whether these Hox targets are involved in CHDs.
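The two-strategy ranking and final integration can be illustrated with a toy sketch; the candidate names and scores below are hypothetical, and the real pipeline scores conserved binding-site combinations rather than single numbers:

```python
# hypothetical per-candidate scores for conserved non-coding sequences
candidates = {
    "CNS1": {"abda_score": 9.1, "conservation": 0.62},
    "CNS2": {"abda_score": 7.4, "conservation": 0.91},
    "CNS3": {"abda_score": 9.5, "conservation": 0.88},
    "CNS4": {"abda_score": 5.2, "conservation": 0.40},
}

def rank_by(key):
    """Rank candidates by descending score for one criterion (1 = best)."""
    ordered = sorted(candidates, key=lambda c: candidates[c][key], reverse=True)
    return {c: r + 1 for r, c in enumerate(ordered)}

r_site = rank_by("abda_score")      # strategy 1: best-scoring ABD-A sites
r_cons = rank_by("conservation")    # strategy 2: binding-site conservation
# integrate the two independent rankings into a final rank (mean position)
final = sorted(candidates, key=lambda c: (r_site[c] + r_cons[c]) / 2)
```

A candidate that is merely good under both criteria can outrank one that tops a single list, which is the point of integrating the two rankings.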

Relevance:

30.00%

Publisher:

Abstract:

Re-identification is commonly accomplished using appearance features based on salient points and color information. In this paper, we present a study on the use of different features obtained exclusively from depth images captured with RGB-D cameras. The results achieved, using simple geometric features extracted in a top-view setup, seem to provide useful descriptors for the re-identification task.

Relevance:

30.00%

Publisher:

Abstract:

In this work, a two-dimensional hyphenated system for the determination of volatile brominated and iodinated hydrocarbons (LHKW) in water and air samples was developed. A gas chromatograph with an electron capture detector (ECD) was coupled online to an element-selective inductively coupled plasma mass spectrometer (ICP-MS). This extremely sensitive analytical system allows the simultaneous identification of unknown and co-eluting peaks as well as simplified quantification by ICP-MS. In a comparison of the GC-ECD-ICP-MS system with conventional detection methods, such as electron-ionization mass spectrometry and the atomic emission detector with microwave-induced plasma, the newly developed hyphenated system performed excellently. The purge-and-trap technique was used to isolate the LHKW from seawater samples; air samples were enriched by drawing air through adsorption material. Within the BMBF subproject ReHaTrop/AFOHAL, samples were taken on the German North Sea coast in August 2001 and in April/May 2002. Concentrations in the water samples ranged from 0.1-158 ng L-1, those in the air samples from 0.01-470 pptv. The measurements confirm the important role of macroalgae in the production of halogenated hydrocarbons. The concentrations of iodinated and brominated hydrocarbons were always higher in samples that had direct contact with macroalgae. Incubation experiments show individual "fingerprints" of biogenic LHKW production for various brown and green macroalgae. The measurements on the North Sea coast also revealed dependencies between the LHKW and meteorological parameters.

Relevance:

30.00%

Publisher:

Abstract:

Element mass spectrometry has been used very successfully in recent years to address various questions in bioanalysis. Hyphenated techniques coupling separation methods such as liquid chromatography (LC) or capillary electrophoresis (CE) with inductively coupled plasma mass spectrometry (ICP-MS), a multi-element detector with excellent quantification capabilities, play a decisive role in the study of biopolymers and their interactions with various metals. For example, various methods have been developed for the separation and detection of metalloproteins or DNA-metal adducts in different sample materials. However, the traditional and most powerful separation method for biopolymers of all kinds, gel electrophoresis, had not yet been coupled online to ICP-MS to address such questions. Several attempts based on laser ablation have been made in this direction, but these techniques must be regarded as very cumbersome and time-consuming. This work describes for the first time the technical realization of an online coupling of gel electrophoresis with ICP-MS. The system is based on a principle from preparative gel electrophoresis in which the separated components are continuously eluted from the gel while the electrophoresis is running. The eluted components are carried with the elution buffer directly into the nebulizer system of the ICP-MS. The first investigations were carried out on fragments of double-stranded DNA (dsDNA). Commercially available standard solutions were analyzed by online GE-ICP-MS via detection of 31P on a high-resolution mass spectrometer with a mass resolution of 4000. The separation conditions (e.g. pH or ionic strength of the buffer solutions) were optimized for the separation of dsDNA fragments in agarose gels and applied to various dsDNA fragments. In a next step, the quantification capabilities for biopolymers were investigated. Very small amounts of dsDNA could be quantified with a precision of better than 3%. Various external calibration approaches were used, such as calibration with a phosphate standard or with a commercially available quantitative dsDNA standard. To demonstrate the potential of the developed method for studying biopolymer-metal interactions, oligonucleotides were incubated with cisplatin under physiological conditions and the reaction products were analyzed by online GE-ICP-MS via 31P and 195Pt detection. Various cisplatin-oligonucleotide adducts were observed in this way; their identification required the use of MALDI-TOF-MS as a complementary form of mass spectrometry. Finally, isotope dilution analysis was employed for quantification.

Relevance:

30.00%

Publisher:

Abstract:

The composition of the atmosphere is frequently perturbed by the emission of gaseous and particulate matter from natural as well as anthropogenic sources. While the impact of trace gases on the radiative forcing of the climate is relatively well understood, the role of aerosol is far more uncertain. Therefore, the study of the vertical distribution of particulate matter in the atmosphere and of its chemical composition contributes valuable information to bridge this knowledge gap. The chemical composition of aerosol reveals information on properties such as radiative behavior and hygroscopicity, and therefore on cloud condensation or ice nucleus potential.

This thesis focuses on aerosol pollution plumes observed in 2008 during the POLARCAT (Polar Study using Aircraft, Remote Sensing, Surface Measurements and Models, of Climate, Chemistry, Aerosols, and Transport) campaign over Greenland in June/July and the CONCERT (Contrail and Cirrus Experiment) campaign over Central and Western Europe in October/November. Measurements were performed with an Aerodyne compact time-of-flight aerosol mass spectrometer (AMS) capable of online size-resolved chemical characterization of non-refractory submicron particles. In addition, the origins of the pollution plumes were determined by means of modeling tools. The characterized pollution episodes originated from a large variety of sources and were encountered at distinct altitudes. They included pure natural emissions from two volcanic eruptions in 2008. By the time of detection over Western Europe, between 10 and 12 km altitude, the plume was about 3 months old and composed of 71 % particulate sulfate and 21 % carbonaceous compounds. Biomass burning (BB) plumes were also observed over Greenland between 4 and 7 km altitude (free troposphere), originating from Canada and East Siberia. The long-range transport took roughly one and two weeks, respectively. The aerosol was composed of 78 % organic matter and 22 % particulate sulfate. Some Canadian and all Siberian BB plumes were mixed with anthropogenic emissions from fossil fuel combustion (FF) in North America and East Asia. It was found that the contribution of particulate sulfate increased with growing influence from anthropogenic activity and Asia, reaching up to 37 % after more than two weeks of transport time. The most exclusively anthropogenic emission source probed in the upper troposphere was engine exhaust from commercial airliners over Germany. However, in-situ characterization of this aerosol type during aircraft chasing was not possible. All long-range transport aerosol was found to have an O:C ratio close to or greater than 1, implying that low-volatility oxygenated organic aerosol was present in each case despite the variety of origins and the large range in age from 3 to 100 days. This leads to the conclusion that organic particulate matter reaches a final and uniform state of oxygenation after at least 3 days in the free troposphere.

Except for aircraft exhaust, all emission sources mentioned above are surface-bound and thus rely on different types of vertical transport mechanisms, such as direct high-altitude injection in the case of a volcanic eruption or severe BB, or uplift by convection, to reach higher altitudes where particles can travel long distances before removal, mainly by cloud scavenging. A lifetime of 7 to 11 days was derived for North American mixed BB and FF aerosol. This in consequence means that emissions from surface point sources, e.g. volcanoes, or regions, e.g. East Asia, do not only have a relevant impact on their immediate surroundings but rather on a hemispheric scale, including such climate-sensitive zones as the tropopause or the Arctic.

Relevance:

30.00%

Publisher:

Abstract:

Online reputation management deals with monitoring and influencing the online record of a person, an organization or a product. The Social Web offers increasingly simple ways to publish and disseminate personal or opinionated information, which can rapidly have a disastrous influence on the online reputation of some of the entities concerned. This dissertation can be split into three parts: in the first part, possible fuzzy clustering applications for the Social Semantic Web are investigated. The second part explores promising Social Semantic Web elements for organizational applications, while in the third part the former two parts are brought together and a fuzzy online reputation analysis framework is introduced and evaluated. The entire PhD thesis is based on literature reviews as well as on argumentative-deductive analyses. The possible applications of Social Semantic Web elements within organizations have been researched using a scenario and an additional case study, together with two ancillary case studies based on qualitative interviews. For the conception and implementation of the online reputation analysis application, a conceptual framework was developed. Employing test installations and prototyping, the essential parts of the framework have been implemented. By following a design sciences research approach, this PhD has created two artifacts: a framework and a prototype as proof of concept. Both artifacts hinge on two core elements: a (cluster analysis-based) translation of tags used in the Social Web into a computer-understandable fuzzy grassroots ontology for the Semantic Web, and a (Topic Maps-based) knowledge representation system, which facilitates a natural interaction with the fuzzy grassroots ontology. This is beneficial to the identification of unknown but essential Web data that could not be realized through conventional online reputation analysis.
The inherent structure of natural language supports humans not only in communication but also in the perception of the world. Fuzziness is a promising tool for transforming those human perceptions into computer artifacts. Through fuzzy grassroots ontologies, the Social Semantic Web becomes more natural and can thus streamline online reputation management.
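The cluster-analysis core of such a tag-to-ontology translation can be illustrated with a minimal fuzzy c-means sketch. The 2-D points standing in for tag embeddings, the fuzzifier and the iteration count are assumptions for illustration, not the thesis' actual pipeline:

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns centers and memberships U (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))            # random initial memberships
    for _ in range(iters):
        W = U ** m                                        # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)          # standard FCM update
    return centers, U

# toy 2-D "tag" points: two loose groups plus one genuinely ambiguous tag
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [0.9, 1.1], [1.1, 0.9],
              [0.5, 0.5]])
centers, U = fuzzy_cmeans(X)
```

Unlike hard clustering, the ambiguous point keeps substantial membership in both clusters, which is exactly the graded ("fuzzy") assignment a grassroots ontology exploits.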

Relevance:

30.00%

Publisher:

Abstract:

Stepwise uncertainty reduction (SUR) strategies aim at constructing a sequence of points for evaluating a function f in such a way that the residual uncertainty about a quantity of interest progressively decreases to zero. Using such strategies in the framework of Gaussian process modeling has been shown to be efficient for estimating the volume of excursion of f above a fixed threshold. However, SUR strategies remain cumbersome to use in practice because of their high computational complexity and the fact that they deliver a single point at each iteration. In this article we introduce several multipoint sampling criteria, allowing the selection of batches of points at which f can be evaluated in parallel. Such criteria are of particular interest when f is costly to evaluate and several CPUs are simultaneously available. We also manage to drastically reduce the computational cost of these strategies through the use of closed-form formulas. We illustrate their performance in various numerical experiments, including a nuclear safety test case. Basic notions about kriging, auxiliary problems, complexity calculations, R code, and data are available online as supplementary materials.
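A minimal sketch of the underlying setting may help. This is a toy 1-D Gaussian-process model with an assumed kernel, threshold and test function, and a naive batch criterion (picking the points whose excursion sign is most uncertain) rather than the article's SUR criteria:

```python
import numpy as np
from scipy.special import erf

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / np.sqrt(2.0)))

def gp_posterior(X, y, Xs, ell=0.2, noise=1e-8):
    """GP posterior mean/sd on Xs (squared-exponential kernel, unit prior variance)."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

f = lambda x: np.sin(3.0 * x)        # toy stand-in for the costly function
T = 0.5                              # fixed excursion threshold
X = np.linspace(0.0, 1.0, 6)         # points evaluated so far
grid = np.linspace(0.0, 1.0, 200)
mu, sd = gp_posterior(X, f(X), grid)
p = Phi((mu - T) / sd)               # pointwise probability that f exceeds T
volume = p.mean()                    # plug-in estimate of the excursion volume
batch = grid[np.argsort(p * (1 - p))[-3:]]   # 3 most sign-uncertain points
```

The true excursion volume of sin(3x) > 0.5 on [0, 1] is about 0.70; a SUR-type strategy would evaluate f at the batch in parallel, update the posterior and repeat until the residual uncertainty about the volume is negligible.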

Relevance:

30.00%

Publisher:

Abstract:

This Final Year Project (TFG) aims to provide an innovative teaching system, one that involves students in tasks and practical exercises through which they acquire knowledge in a game-like atmosphere, that is, learning in an enjoyable way. It is intended for the educational system of the Escuela Técnica Superior de Ingenieros Informáticos of the Universidad Politécnica de Madrid, specifically for the subjects related to Language Processors. The application developed in this work is aimed both at the professors of the Language Processors subjects and at the students connected with those subjects, achieving greater interaction and enjoyment when carrying out the tasks and practical exercises of the subjects. For both types of user, the application allows them to log in with their credentials, checking whether the data entered are correct before granting access to the system. Depending on the type of user identified, different options are available within the system. Professors can create, view, modify or delete the configurations of the language analyzers for the different subjects preconfigured in the system. In addition, professors can create, view, modify or delete the code fragments that make up the files for the lexical-analyzer test templates offered to students for checking their practical exercises, and through the application they can set various characteristics and properties of the fragments they add to the system. Students, in turn, can configure the language defined by the professors for the lexical-analyzer part of the practical exercises. This configuration is saved for the student's group, and any member of the group can modify it. The necessary relationships can then be established between the elements of the language, as configured by the professors, and the elements of the students' practical exercises. Students can also check the lexical part of their exercises using the files generated by the system from their exercise options and the fragments added by the professors. In this way, students are informed of the success of the tests or of any failures, whether due to the format of the uploaded results file or to incorrect content within it. All functions offered by this application are fully online, with an attractive and fun interface characterized by its ease of use and convenience. The work carried out for this project complies with the Web Content Accessibility Guidelines (WCAG 2.0) and correctly uses HTML5 and CSS3, so that users get an easy, convenient and attractive application.