867 results for Graph-based method
Innovative analytical strategies for the development of sensor devices and mass spectrometry methods
Abstract:
The work presented in this doctoral thesis focuses on the development of innovative analytical strategies based on sensor devices and mass spectrometry techniques in the fields of biology and food safety. The first chapter addresses methodological and applicative aspects of sensor-based procedures for the identification and determination of biomarkers associated with celiac disease. In this context, two immunosensors were developed, one with piezoelectric and one with amperometric transduction, for the detection of the anti-tissue transglutaminase antibodies associated with this disease. The innovation of these devices lies in the immobilization of the tTG enzyme in its open conformation (Open-tTG), which has been shown to be the one mainly involved in pathogenesis. Based on the results obtained, both systems proved to be a valid alternative to the screening tests currently used for the diagnosis of celiac disease. Still within the context of celiac disease, further research in this doctoral thesis concerned the development of reliable methods for the control of "gluten-free" products. The second chapter deals with the development of a mass spectrometry method and of a competitive immunosensor for the detection of prolamins in "gluten-free" foods. An LC-ESI-MS/MS method was developed, based on targeted analysis with selected reaction monitoring acquisition, for the identification of gluten in several cereals potentially toxic for celiac patients. In addition, work focused on a competitive immunosensor for the detection of gliadin as a rapid screening method for flours. Both systems were optimized using rice flour spiked with gliadin, avenins, hordeins and secalins in the case of the LC-MS/MS system, and with gliadin alone in the case of the sensor. Finally, the analytical systems were validated by analyzing both raw materials (flours) and finished foods (biscuits, pasta, bread, etc.). The mass spectrometry approach opens the way to a multiplex screening test for assessing the safety of products declared "gluten-free", while further studies will be needed to find extraction conditions compatible with the competitive immunoassay, which for now is applicable only to the analysis of flours extracted with ethanol. The third chapter of this thesis concerns the development of new methods for the detection of HPV, Chlamydia and Gonorrhoeae in biological fluids. Paper strips were chosen as the substrate because they provide a valid detection platform, offering advantages in terms of low cost, the possibility of building portable devices, and the ability to read the result by eye without any instrumentation. The methodology developed is very simple, requires no complex instrumentation and is based on isothermal rolling-circle amplification (RCA) of the target. Of fundamental importance, moreover, is the use of colored nanoparticles which, functionalized with a DNA sequence complementary to the amplified target produced by RCA, allow its detection by the naked eye using paper filters.
These strips were tested on real samples, allowing positive and negative samples to be discriminated rapidly (10-15 minutes) and opening a new route towards tests highly competitive with those currently on the market.
Abstract:
This work presents a method for estimating knee torque from electromyographic (EMG) signals during robot-assisted rehabilitation therapy. The EMG signals, acquired from five muscles involved in knee flexion and extension, are processed to obtain the muscle activations. Then, using a simple muscle contraction model, the muscle forces are computed and, from the joint geometry, the knee torque. The muscle activation and contraction functions have bounded parameters that must be calibrated for each user; the adjustment is performed by minimizing the error between the estimated torque and the torque measured at the joint through inverse dynamics. Two iterative methods for nonlinear functions are compared as constrained optimization techniques for parameter calibration: Gradient Descent and Quasi-Newton. Signal processing, parameter calibration and computation of the estimated torque were implemented in MATLAB®; the measured torque was computed in OpenSim with its inverse dynamics tool.
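As a rough illustration of the calibration step described above (the work itself used MATLAB and OpenSim), the Python sketch below fits the bounded parameters of a deliberately simplified linear activation-to-force model by minimizing the squared error against a stand-in for the inverse-dynamics torque, using a bounded quasi-Newton optimizer; the muscle gains, moment arms, bounds and synthetic data are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

n_muscles = 5  # muscles acting on the knee in flexion/extension (illustrative)

def estimated_torque(params, act):
    """Simplified contraction model: force_i = gain_i * activation_i,
    knee torque = sum_i force_i * moment_arm_i (extensors +, flexors -)."""
    gains = params[:n_muscles]
    arms = params[n_muscles:]
    return (act * gains) @ arms

# Synthetic stand-ins for the processed EMG activations and for the torque
# that inverse dynamics (OpenSim) would provide for the same trial.
rng = np.random.default_rng(0)
activations = rng.uniform(0.0, 1.0, (500, n_muscles))
true_params = np.concatenate([[150, 140, 120, 130, 110.0],
                              [0.04, 0.04, 0.03, -0.035, -0.03]])
tau_measured = estimated_torque(true_params, activations)

def objective(params):
    err = estimated_torque(params, activations) - tau_measured
    return np.mean(err ** 2)  # mean squared torque error to minimize

# Bounded per-user parameters, calibrated with a quasi-Newton method.
bounds = [(10.0, 250.0)] * n_muscles + [(-0.08, 0.08)] * n_muscles
x0 = np.concatenate([np.full(n_muscles, 100.0), np.full(n_muscles, 0.02)])
result = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
print("calibration RMSE [N*m]:", np.sqrt(result.fun))
```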
Abstract:
The efficiency and energy rationality of public lighting are of considerable importance to the electric power system, because they help reduce both the need for investment in new electricity generation sources and energy waste. The objective of this research is the development and application of the IDE (energy performance index), based on a fuzzy inference system and on indicators of efficiency and rationality in the use of electric energy. Fuzzy inference was chosen for its ability to reproduce part of human reasoning and to establish relationships among the diverse indicators involved. To build the fuzzy inference system, the efficiency and rationality indicators were defined as input variables, the inference method was based on rules produced by a public lighting specialist, and the output is a real number that characterizes the IDE. The efficiency and rationality indicators are divided into two classes: global and specific. The global indicators are FP (power factor), FC (load factor) and FD (demand factor). The specific indicators are FU (utilization factor), ICA (energy consumption per illuminated area), IE (energy intensity) and IL (natural lighting intensity). For the application of this work, the public lighting of the "Armando de Salles Oliveira" campus of the Universidade de São Paulo was selected and characterized. With the index developed in this work, the manager of the lighting system can therefore evaluate the use of electric energy and, on that basis, devise and simulate strategies aimed at saving it.
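A minimal sketch of how such a fuzzy inference could work, reduced to two of the indicators above (FP and FC); the triangular membership functions and the two rules are illustrative assumptions, not the expert rule base used in the study.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def ide(fp, fc):
    # Fuzzify the inputs (power factor and load factor, both in 0..1).
    fp_low, fp_high = tri(fp, 0.0, 0.0, 0.92), tri(fp, 0.85, 1.0, 1.0)
    fc_low, fc_high = tri(fc, 0.0, 0.0, 0.5), tri(fc, 0.3, 1.0, 1.0)

    # Two example rules (min for AND, max for OR, Mamdani-style):
    # R1: IF FP is high AND FC is high THEN IDE is good
    # R2: IF FP is low  OR  FC is low  THEN IDE is poor
    w_good = min(fp_high, fc_high)
    w_poor = max(fp_low, fc_low)

    # Aggregate the clipped output sets over the IDE universe and defuzzify (centroid).
    x = np.linspace(0.0, 1.0, 201)
    good = np.minimum(tri(x, 0.5, 1.0, 1.0), w_good)
    poor = np.minimum(tri(x, 0.0, 0.0, 0.5), w_poor)
    agg = np.maximum(good, poor)
    return float((x * agg).sum() / (agg.sum() + 1e-9))

print(ide(fp=0.95, fc=0.6))   # closer to 1: efficient, rational use
print(ide(fp=0.70, fc=0.2))   # closer to 0: poor energy performance
```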
Abstract:
Poster presented at SPIE Photonics Europe, Brussels, 16-19 April 2012.
Abstract:
This paper presents a method for sentence-level subjectivity detection based on subjective word sense disambiguation. To this end, a semantic disambiguation method based on sense clustering is extended to determine when the words in a sentence are being used in a subjective or an objective way. Our proposal uses semantic resources annotated with polarity and emotion values to determine when a word sense can be considered subjective or objective. An experimental study on sentence-level subjectivity detection is presented, using the MPQA corpus and the Movie Review Dataset collections together with the semantic resources SentiWordNet, Micro-WNOp and WordNet-Affect. The results obtained show that our proposal contributes significantly to subjectivity detection.
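As a toy illustration of the decision rule, assuming the sense assignment has already been made, the sketch below flags a sentence as subjective when the sense-level polarity scores of its words exceed a threshold; the tiny lexicon merely stands in for SentiWordNet-style resources.

```python
# Sense id -> (positive score, negative score); placeholder values only.
SENSE_POLARITY = {
    "great.a.01": (0.75, 0.0),
    "great.a.02": (0.0, 0.0),        # "great" as in "large": objective use
    "plot.n.01": (0.0, 0.0),
    "boring.a.01": (0.0, 0.625),
}

def sense_is_subjective(sense_id, threshold=0.5):
    """A sense counts as subjective when its polarity mass is high enough."""
    pos, neg = SENSE_POLARITY.get(sense_id, (0.0, 0.0))
    return (pos + neg) >= threshold

def sentence_is_subjective(disambiguated_senses, min_subjective_words=1):
    """Flag the sentence when enough of its disambiguated senses are subjective."""
    hits = sum(sense_is_subjective(s) for s in disambiguated_senses)
    return hits >= min_subjective_words

print(sentence_is_subjective(["plot.n.01", "boring.a.01"]))   # True
print(sentence_is_subjective(["great.a.02", "plot.n.01"]))    # False
```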
Abstract:
In this work we present preliminary results obtained by applying a new semantic graph construction technique to the task of word sense disambiguation in a multilingual setting. Using this unsupervised technique, we induce the senses associated with the translations of the ambiguous word under consideration in the target language. The translations of the words surrounding the ambiguous word in the source language are used to select the most likely sense of the translation. The system was evaluated on the dataset of a cross-lingual disambiguation task proposed at the SemEval-2010 competition, where it outperformed all the unsupervised systems that participated in that task.
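The sketch below illustrates, on a toy Spanish-to-English example, the kind of selection step described: candidate translations of the ambiguous word are scored by their connectivity, in a co-occurrence graph of target-language words, to the translations of the context words. The graph and example are invented for illustration and do not reproduce the SemEval-2010 setup.

```python
from collections import defaultdict

# Undirected co-occurrence graph over target-language words (edge = count).
graph = defaultdict(dict)
def add_edge(u, v, w=1):
    graph[u][v] = graph[v][u] = w

# Ambiguous Spanish word "banco": candidate English translations.
candidates = ["bank", "bench"]
# Translations of the source-sentence context words ("dinero", "cuenta", ...).
context = ["money", "account", "loan"]

add_edge("bank", "money", 8); add_edge("bank", "account", 6)
add_edge("bank", "loan", 5);  add_edge("bench", "park", 7)
add_edge("bench", "sit", 4)

def score(candidate, context_words):
    """Sum of edge weights linking the candidate to the translated context."""
    return sum(graph[candidate].get(w, 0) for w in context_words)

best = max(candidates, key=lambda c: score(c, context))
print(best)   # -> "bank"
```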
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
In simultaneous analyses of multiple data partitions, the trees relevant when measuring support for a clade are the optimal tree and the best tree lacking the clade (i.e., the most reasonable alternative). The parsimony-based method of partitioned branch support (PBS) forces each data set to arbitrate between the two relevant trees. This value is the amount each data set contributes to clade support in the combined analysis, and can be very different from the support apparent in separate analyses. The approach used in PBS can also be employed in likelihood: a simultaneous analysis of all data retrieves the maximum likelihood tree, and the best tree without the clade of interest is also found. Each data set is fitted to the two trees and the log-likelihood difference calculated, giving partitioned likelihood support (PLS) for each data set. These calculations can be performed regardless of the complexity of the ML model adopted. The significance of PLS can be evaluated using a variety of resampling methods, such as the Kishino-Hasegawa test, the Shimodaira-Hasegawa test, or likelihood weights, although the appropriateness and assumptions of these tests remain debated.
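A minimal numerical sketch of the PLS calculation, with made-up per-partition log-likelihoods: each partition's support is its log-likelihood on the simultaneous-analysis ML tree minus its log-likelihood on the best tree lacking the clade.

```python
# Per-partition log-likelihoods on the two relevant trees (invented numbers).
lnL_ml_tree  = {"mtDNA": -10234.6, "nuclear": -8412.3, "morphology": -512.9}
lnL_alt_tree = {"mtDNA": -10241.1, "nuclear": -8410.8, "morphology": -513.4}

pls = {part: lnL_ml_tree[part] - lnL_alt_tree[part] for part in lnL_ml_tree}
for part, value in pls.items():
    # Positive: the partition supports the clade; negative: it conflicts with it.
    print(f"{part:11s} PLS = {value:+.2f}")

total = sum(pls.values())
print(f"summed support = {total:+.2f}")  # equals the total lnL difference between the trees
```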
Abstract:
Inferring the spatial expansion dynamics of invading species from molecular data is notoriously difficult due to the complexity of the processes involved. For these demographic scenarios, genetic data obtained from highly variable markers may be profitably combined with specific sampling schemes and information from other sources using a Bayesian approach. The geographic range of the introduced toad Bufo marinus is still expanding in eastern and northern Australia, in each case from isolates established around 1960. A large amount of demographic and historical information is available on both expansion areas. In each area, samples were collected along a transect representing populations of different ages and genotyped at 10 microsatellite loci. Five demographic models of expansion, differing in the dispersal pattern for migrants and founders and in the number of founders, were considered. Because the demographic history is complex, we used an approximate Bayesian method, based on a rejection-regression algorithm, to formally test the relative likelihoods of the five models of expansion and to infer demographic parameters. A stepwise migration-foundation model with founder events was statistically better supported than the other four models in both expansion areas. Posterior distributions supported different dynamics of expansion in the studied areas. Populations in the eastern expansion area have a lower stable effective population size and have been founded by a smaller number of individuals than those in the northern expansion area. Once demographically stabilized, populations exchange a substantial number of effective migrants per generation in both expansion areas, and such exchanges are larger in northern than in eastern Australia. The effective number of migrants appears to be considerably lower than that of founders in both expansion areas. We found our inferences to be relatively robust to various assumptions on marker, demographic, and historical features. The method presented here is the only robust, model-based method available so far, which allows inferring complex population dynamics over a short time scale. It also provides the basis for investigating the interplay between population dynamics, drift, and selection in invasive species.
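For orientation only, the sketch below shows the plain rejection step of approximate Bayesian model choice; the simulator, summary statistic and priors are placeholders, and the local linear regression adjustment used in the study is omitted.

```python
import numpy as np
rng = np.random.default_rng(1)

def simulate(model, n_founders):
    """Placeholder simulator returning one summary statistic
    (e.g. mean expected heterozygosity along the transect)."""
    base = 0.55 if model == "stepwise_foundation" else 0.70
    return base - 0.02 * np.log(n_founders) + rng.normal(0.0, 0.03)

observed = 0.52                       # summary statistic from the real data (invented)
models = ["stepwise_foundation", "migrant_pool"]
n_sims, tolerance = 20000, 0.01

accepted = []
for _ in range(n_sims):
    model = rng.choice(models)        # model prior: uniform over candidate models
    n_founders = rng.integers(2, 100) # parameter prior
    if abs(simulate(model, n_founders) - observed) < tolerance:
        accepted.append(model)

# Posterior model probabilities ~ acceptance frequencies among retained draws.
for m in models:
    print(m, accepted.count(m) / max(len(accepted), 1))
```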
Abstract:
An emerging public health phenomenon is the increasing incidence of methicillin-resistant Staphylococcus aureus (MRSA) infections that are acquired outside of health care facilities. One lineage of community-acquired MRSA (CA-MRSA) is known as the Western Samoan phage pattern (WSPP) clone. The central aim of this study was to develop an efficient genotyping procedure for the identification of WSPP isolates. The approach taken was to make use of the highly variable region downstream of mecA in combination with a single nucleotide polymorphism (SNP) defined by the S. aureus multilocus sequence typing (MLST) database. The premise was that a combinatorial genotyping method that interrogated both a highly variable region and the genomic backbone would deliver a high degree of informative power relative to the number of genetic polymorphisms interrogated. Thirty-five MRSA isolates were used for this study, and their gene contents and order downstream of mecA were determined. The CA-MRSA isolates were found to contain a truncated mecA downstream region consisting of mecA-HVR-IS431mec-dcs-Ins117, and a PCR-based method for identifying this structure was developed. The hospital-acquired isolates were found to contain eight different mecA downstream regions, three of which were novel. The Minimum SNPs computer software program was used to mine the S. aureus MLST database, and the arcC 2726 polymorph was identified as 82% discriminatory for ST-30. A real-time PCR assay was developed to interrogate this SNP. We found that the assay for the truncated mecA downstream region in combination with the interrogation of arcC position 272 provided an unambiguous identification of WSPP isolates.
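Purely as an illustration of the combinatorial call, the snippet below combines hypothetical boolean read-outs of the two assays into a WSPP classification; the function name and its decision rules are illustrative and are not taken from the study.

```python
def classify_isolate(truncated_mecA_region_present: bool,
                     arcC_snp_st30_allele: bool) -> str:
    """Combine the mecA-downstream PCR result with the arcC SNP assay result."""
    if truncated_mecA_region_present and arcC_snp_st30_allele:
        return "WSPP CA-MRSA"
    if truncated_mecA_region_present or arcC_snp_st30_allele:
        return "indeterminate - confirm with further typing"
    return "non-WSPP"

print(classify_isolate(True, True))    # -> WSPP CA-MRSA
print(classify_isolate(True, False))   # -> indeterminate
```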
Abstract:
Organosilica microspheres synthesised via a novel surfactant-free emulsion-based method show applicability towards optical encoding, solid-phase synthesis and high-throughput screening of bound oligonucleotide and peptide sequences.
Abstract:
In this paper, numerical simulations are used in an attempt to find optimal source profiles for high frequency radiofrequency (RF) volume coils. Biologically loaded, shielded/unshielded circular and elliptical birdcage coils operating at 170 MHz, 300 MHz and 470 MHz are modelled using the FDTD method for both 2D and 3D cases. Taking advantage of the fact that some aspects of the electromagnetic system are linear, two approaches have been proposed for the determination of the drives for individual elements in the RF resonator. The first method is an iterative optimization technique with a kernel for the evaluation of RF fields inside an imaging plane of a human head model using pre-characterized sensitivity profiles of the individual rungs of a resonator; the second method is a regularization-based technique. In the second approach, a sensitivity matrix is explicitly constructed and a regularization procedure is employed to solve the ill-posed problem. Test simulations show that both methods can improve the B1-field homogeneity in both focused and non-focused scenarios. While the regularization-based method is more efficient, the first optimization method is more flexible as it can take into account other issues such as controlling SAR or reshaping the resonator structures. It is hoped that these schemes and their extensions will be useful for the determination of multi-element RF drives in a variety of applications.
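A compact sketch of the regularization-based approach under stated assumptions: random complex numbers stand in for the FDTD-derived rung sensitivity maps, the target is a uniform B1 field over the imaging plane, and the Tikhonov weight is picked by hand.

```python
import numpy as np
rng = np.random.default_rng(0)

n_voxels, n_rungs = 2000, 16
# Complex per-rung B1 sensitivities at each voxel of the imaging plane (stand-in data).
S = rng.normal(size=(n_voxels, n_rungs)) + 1j * rng.normal(size=(n_voxels, n_rungs))
b_target = np.ones(n_voxels, dtype=complex)      # uniform target B1 over the plane

lam = 0.1                                        # regularization weight (hand-tuned here)
# Minimize ||S w - b||^2 + lam^2 ||w||^2  (Tikhonov / ridge solution).
A = S.conj().T @ S + (lam ** 2) * np.eye(n_rungs)
w = np.linalg.solve(A, S.conj().T @ b_target)    # complex drive per rung (amplitude + phase)

achieved = S @ w
print("coefficient of variation of |B1|:",
      np.std(np.abs(achieved)) / np.mean(np.abs(achieved)))
```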
Abstract:
With the rapid increase in both centralized video archives and distributed WWW video resources, content-based video retrieval is gaining in importance. To support such applications efficiently, content-based video indexing must be addressed. Typically, each video is represented by a sequence of frames. Due to the high dimensionality of frame representation and the large number of frames, video indexing introduces an additional degree of complexity. In this paper, we address the problem of content-based video indexing and propose an efficient solution, called the Ordered VA-File (OVA-File), based on the VA-file. OVA-File is a hierarchical structure and has two novel features: 1) partitioning the whole file into slices such that only a small number of slices are accessed and checked during k Nearest Neighbor (kNN) search and 2) efficient handling of insertions of new vectors into the OVA-File, such that the average distance between the new vectors and those approximations near that position is minimized. To facilitate a search, we present an efficient approximate kNN algorithm named Ordered VA-LOW (OVA-LOW) based on the proposed OVA-File. OVA-LOW first chooses possible OVA-Slices by ranking the distances between their corresponding centers and the query vector, and then visits all approximations in the selected OVA-Slices to compute the approximate kNN. The number of possible OVA-Slices is controlled by a user-defined parameter delta. By adjusting delta, OVA-LOW provides a trade-off between the query cost and the result quality. Query by video clip consisting of multiple frames is also discussed. Extensive experimental studies using real video data sets were conducted, and the results showed that our methods can yield a significant speed-up over an existing VA-file-based method and iDistance with high query result quality. Furthermore, by incorporating temporal correlation of video content, our methods achieved much more efficient performance.
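The sketch below captures only the slice-then-search idea in simplified form (it does not implement the VA-file approximations): vectors are grouped into slices with centers, and a query scans just the delta slices whose centers lie closest, trading result quality for query cost.

```python
import numpy as np
rng = np.random.default_rng(0)

dim, n_slices, per_slice = 64, 32, 200
centers = rng.normal(size=(n_slices, dim))
# Toy "video frame features", stored slice by slice around each center.
slices = [c + 0.3 * rng.normal(size=(per_slice, dim)) for c in centers]

def approx_knn(query, k=5, delta=4):
    # 1) Rank slices by the distance between their centers and the query.
    order = np.argsort(np.linalg.norm(centers - query, axis=1))[:delta]
    # 2) Scan only the selected slices and keep the k closest vectors.
    cand = np.vstack([slices[i] for i in order])
    d = np.linalg.norm(cand - query, axis=1)
    return cand[np.argsort(d)[:k]], np.sort(d)[:k]

query = rng.normal(size=dim)
_, dists = approx_knn(query, k=5, delta=4)
print("approximate 5-NN distances:", np.round(dists, 3))
```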
Abstract:
The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency, and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (that adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. The present paper presents two algorithmic enhancements to the GML method that retain its strengths, but which overcome its weaknesses in the face of local optima. Using the first of these methods, an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality when progress of the parameter estimation process is slowed either by numerical instability incurred through problem ill-posedness, or when a local objective function minimum is encountered. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space. This can provide a useful means of inquiring into the well-posedness of a parameter estimation problem, and for detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model run efficiency for the new method.
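A toy sketch of the restart strategy under stated assumptions: each new local run (scipy's Levenberg-Marquardt solver standing in for GML) starts at the candidate point maximally removed from the parameter points visited by earlier runs; the two-parameter objective is invented and is not an HSPF calibration.

```python
import numpy as np
from scipy.optimize import least_squares
rng = np.random.default_rng(0)

def residuals(p):
    # Toy multimodal calibration problem with several local minima.
    return np.array([np.sin(3 * p[0]) + 0.1 * p[0] ** 2 - 0.2,
                     np.cos(2 * p[1]) + 0.1 * (p[1] - 1) ** 2])

visited = []                          # parameter points traversed by earlier runs
minima = []
for run in range(5):
    if not visited:
        x0 = rng.uniform(-3, 3, size=2)
    else:
        # Pick, from random candidates, the start maximally removed from
        # previously visited parameter points.
        cands = rng.uniform(-3, 3, size=(500, 2))
        prev = np.array(visited)
        d_min = np.min(np.linalg.norm(cands[:, None, :] - prev[None, :, :], axis=2), axis=1)
        x0 = cands[np.argmax(d_min)]
    sol = least_squares(residuals, x0, method="lm")   # gradient-based local search
    visited.extend([x0, sol.x])
    minima.append((round(2 * sol.cost, 4), np.round(sol.x, 3)))

for cost, x in sorted(minima, key=lambda t: t[0]):
    print(cost, x)                    # distinct (local) optima located by the restarts
```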