795 results for label hierarchical clustering


Relevance: 20.00%

Abstract:

Antibody microarrays are of great research interest because of their potential application as biosensors for high-throughput protein and pathogen screening technologies. In this active area, there is still a need for novel structures and assemblies that provide insight into binding interactions, such as spherical and annulus-shaped protein structures, e.g. the use of curved surfaces for enhanced protein-protein interactions and antigen detection. The goal of the presented work was therefore to establish a new technique for the label-free detection of biomolecules and bacteria on topographically structured surfaces suitable for antibody binding.

In the first part of the thesis, the fabrication of monolayers of inverse opals with 10 μm diameter and the immobilization of antibodies on their interior surface is described. For this purpose, several established methods for linking antibodies to glass, including Schiff bases, EDC/S-NHS chemistry and the biotin-streptavidin affinity system, were tested. The employed methods included immunofluorescence and image analysis by phase contrast microscopy. It could be shown that these methods were not successful in terms of antibody immobilization and subsequent bacteria binding. Hence, a method based on the application of an active-ester silane was introduced. It showed promising results but also the need for further analysis. In particular, the search for alternative antibodies addressing other antigens on the exterior of bacteria will be pursued in the future.

Building on this ability to control antibody-functionalized surfaces, a new technique is presented that employs colloidal templating to yield large-scale (~cm²) 2D arrays of antibodies against E. coli K12, eGFP and human integrin αvβ3 on a versatile glass surface. The antibodies were swept to reside around the templating microspheres during solution drying and physisorbed on the glass.
After removing the microspheres, the formation of annulus-shaped antibody structures was observed. The preserved antibody structure and functionality are shown by binding of the specific antigens and secondary antibodies. Improved detection of specific bacteria from a crude solution, compared to conventional "flat" antibody surfaces, is demonstrated, as is the establishment of an integrin-binding platform for targeted recognition and surface interactions of eukaryotic cells. The structures were investigated by atomic force, confocal and fluorescence microscopy. Operational parameters such as drying time, temperature, humidity and surfactants were optimized to obtain a stable antibody structure.

Relevance: 20.00%

Abstract:

The goal of clustering is thus to identify meaningful structures in the data, and it is precisely from this definition that this thesis work began, taking an innovative and unexplored approach to clustering: instead of searching for the relationships, reasoning about what a relationship is not. Looking at a set of data, what does non-relationship represent? A difficult question to pose, yet one that intrinsically contains its own answer: the independence of each individual data point from all the others. The search for independence among the data therefore led our thinking toward a statistical view of the data, since independence is well described and proven in statistics. For a point in a dataset to be considered "free of links/relationships", it must have the same probability of occurring in every spatial element of the entire dataset. Mathematically speaking, every point P in a space S has the same probability of falling in a region R; this means that such a point can RANDOMLY lie within any region of the dataset. From this assumption the thesis work begins, divided into several parts. The second chapter analyzes the state of the art of clustering in light of the growing problem of data volume: with the spread of the internet, knowledge bases have grown exponentially both in the number of attributes (dimensionality) and in the quantity of data (Big Data). The third chapter reviews the theoretical and statistical concepts used by the implemented statistical algorithms. The fourth chapter gives the implementation details of the algorithms, describing the various phases of investigation, the rationale for the architectural choices, and the considerations that led to the exclusion of one of the three implemented versions.
In the fifth chapter, algorithms 2 and 3 are compared with several algorithms from the literature in order to demonstrate the potential and the shortcomings of the developed algorithm. These tests are qualitative, since the objective of the thesis is to show that a statistical approach can be a winning one, not to provide a new algorithm applicable to every clustering problem. The sixth chapter draws conclusions about the work carried out and lists possible future directions along which the newly begun research on statistical clustering could grow.
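The independence assumption described above, that every point is equally likely to fall in any region R of the space S, can be checked with a simple goodness-of-fit sketch. The following is a minimal illustration and not the thesis's algorithm: it grids the space into equal cells and computes a chi-square statistic against the uniform expectation; clustered data inflate the statistic, while spatially random data keep it small.

```python
import random

def csr_chi_square(points, bounds, bins=4):
    """Chi-square statistic for complete spatial randomness (CSR):
    under independence, a point is equally likely to fall in any
    equal-area cell of the space, so cell counts should be uniform."""
    (xmin, xmax), (ymin, ymax) = bounds
    counts = [[0] * bins for _ in range(bins)]
    for x, y in points:
        i = min(int((x - xmin) / (xmax - xmin) * bins), bins - 1)
        j = min(int((y - ymin) / (ymax - ymin) * bins), bins - 1)
        counts[i][j] += 1
    expected = len(points) / (bins * bins)  # uniform expectation per cell
    return sum((c - expected) ** 2 / expected
               for row in counts for c in row)

# Uniform points should give a small statistic; a tight cluster a large one.
random.seed(0)
uniform = [(random.random(), random.random()) for _ in range(400)]
clustered = [(random.gauss(0.5, 0.02) % 1, random.gauss(0.5, 0.02) % 1)
             for _ in range(400)]
b = ((0.0, 1.0), (0.0, 1.0))
print(csr_chi_square(uniform, b) < csr_chi_square(clustered, b))  # True
```

A point set that passes such a test everywhere carries no cluster structure; regions where the statistic rejects uniformity are exactly where relationships live.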

Relevance: 20.00%

Abstract:

Atmospheric aerosol particles affect people and the environment in many ways. A precise characterization of the particles helps us understand their effects and assess the consequences. Particles can be characterized by their size, their shape and their chemical composition. Laser ablation mass spectrometry makes it possible to determine the size and chemical composition of individual aerosol particles. In this work, the SPLAT (Single Particle Laser Ablation Time-of-flight mass spectrometer) was further developed for improved analysis of atmospheric aerosol particles in particular. The aerosol inlet was optimized to transfer as wide a particle size range as possible (80 nm - 3 µm) into the SPLAT and to focus the particles into a narrow beam. A new description of the relationship between particle size and particle velocity in vacuum was found. The alignment of the inlet was automated using stepper motors. The optical detection of the particles was improved so that particles smaller than 100 nm can now be detected. Building on the optical detection and the automated tilting of the inlet, a new method for characterizing the particle beam was developed. The control electronics of the SPLAT were improved so that the maximum analysis rate is limited only by the ablation laser, which can ablate at no more than about 10 Hz. Optimizing the vacuum system reduced ion losses in the mass spectrometer by a factor of 4.

Besides these hardware developments, a large part of this work consisted of designing and implementing a software solution for analyzing the raw data acquired with the SPLAT. CRISP (Concise Retrieval of Information from Single Particles) is a software package built on IGOR PRO (Wavemetrics, USA) that allows efficient evaluation of the single-particle raw data.
CRISP contains a newly developed algorithm for the automatic mass calibration of every single mass spectrum, including the suppression of noise and of problems with signals that show intense tailing. CRISP provides methods for the automatic classification of particles: implemented are k-means, fuzzy c-means and a form of hierarchical partitioning based on a minimum spanning tree. CRISP offers data preprocessing so that the automatic classification of the particles runs faster and yields results of higher quality. In addition, CRISP can easily sort particles according to predefined criteria. The data structures and infrastructure underlying CRISP were designed with maintainability and extensibility in mind.

In the course of this work, the SPLAT was successfully deployed in several field campaigns, and the capabilities of CRISP were demonstrated on the datasets obtained.

The SPLAT can now be operated efficiently in the field to characterize atmospheric aerosol, while CRISP enables fast and targeted evaluation of the data.
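The minimum-spanning-tree-based hierarchical partitioning mentioned above can be sketched as follows. This is a generic illustration, not CRISP's implementation: build the MST over the items (here, 2-D points instead of mass spectra), remove the k-1 longest edges, and take the remaining connected components as the k clusters, which is equivalent to cutting a single-linkage hierarchy at k groups.

```python
import math

def mst_clusters(points, k):
    """MST clustering: build the minimum spanning tree (Prim's
    algorithm), drop the k-1 longest edges, and return a component
    label per point."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    in_tree = {0}
    edges = []  # MST edges as (length, u, v)
    best = {v: (dist(0, v), 0) for v in range(1, n)}
    while len(in_tree) < n:
        v = min(best, key=lambda w: best[w][0])
        d, u = best.pop(v)
        in_tree.add(v)
        edges.append((d, u, v))
        for w in best:
            dw = dist(v, w)
            if dw < best[w][0]:
                best[w] = (dw, v)
    edges.sort()
    keep = edges[: n - k]  # removing the k-1 longest edges
    # union-find to label the resulting connected components
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for _, u, v in keep:
        parent[find(u)] = find(v)
    return [find(i) for i in range(n)]

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
labels = mst_clusters(pts, 2)
print(labels[0] == labels[1] == labels[2], labels[3] == labels[4],
      labels[0] != labels[3])  # True True True
```

For real spectra the Euclidean distance on 2-D points would be replaced by a distance between mass spectra; the cutting logic is unchanged.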

Relevance: 20.00%

Abstract:

Modern ESI-LC-MS/MS techniques combined with bottom-up approaches allow the qualitative and quantitative characterization of several thousand proteins in a single experiment. Data-independent acquisition methods such as MSE and the IMS variants HDMSE and UDMSE are particularly well suited to label-free protein quantification. Because of their high complexity, the data acquired in this way place special demands on the analysis software; until now, quantitative analysis of MSE/HDMSE/UDMSE data has remained the preserve of a few commercial solutions.

In this work, a strategy and a series of new methods for the cross-run quantitative analysis of label-free MSE/HDMSE/UDMSE data were developed and implemented as the software ISOQuant. The first steps of the data analysis (feature detection, peptide and protein identification) use the commercial software PLGS. The independent PLGS results of all runs of an experiment are then merged into a relational database and reworked with dedicated algorithms (retention time alignment, feature clustering, multidimensional intensity normalization, multi-stage data filtering, protein inference, redistribution of the intensities of shared peptides, protein quantification). This post-processing significantly increases the reproducibility of the qualitative and quantitative results.

To evaluate the performance of the quantitative data analysis and compare it with other solutions, a set of exactly defined hybrid-proteome samples was developed. The samples were acquired with the MSE and UDMSE methods, analyzed with Progenesis QIP, synapter and ISOQuant, and compared.
In contrast to synapter and Progenesis QIP, ISOQuant achieved both high reproducibility of protein identification and high precision and accuracy of protein quantification.

In conclusion, the presented algorithms and the analysis workflow enable reliable and reproducible quantitative data analyses. The software ISOQuant is a simple and efficient tool for routine high-throughput analyses of label-free MSE/HDMSE/UDMSE data. Together, the hybrid-proteome samples and the evaluation metrics constitute a comprehensive system for evaluating quantitative acquisition and data analysis pipelines.
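One of the post-processing steps listed above, intensity normalization across runs, can be illustrated in its simplest form. The sketch below is a plain median normalization with the first run as reference; ISOQuant's actual normalization is multidimensional, so this only conveys the basic idea of compensating global differences in sample load between runs.

```python
import statistics

def normalize_runs(runs):
    """Median normalization across LC-MS runs: scale every run so
    its median feature intensity matches that of the reference
    (first) run, compensating global load differences."""
    ref_median = statistics.median(runs[0])
    out = []
    for run in runs:
        factor = ref_median / statistics.median(run)
        out.append([x * factor for x in run])
    return out

# Run 2 was loaded at twice the amount; after normalization the medians agree.
runs = [[100.0, 200.0, 300.0], [200.0, 400.0, 600.0]]
norm = normalize_runs(runs)
print([statistics.median(r) for r in norm])  # [200.0, 200.0]
```

Real pipelines typically restrict the scaling factor estimate to features detected in all runs, so that missing values do not bias the medians.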

Relevance: 20.00%

Abstract:

In recent years, deep learning techniques have been shown to perform well on a large variety of problems in both Computer Vision and Natural Language Processing, reaching and often surpassing the state of the art on many tasks. The rise of deep learning is also revolutionizing the entire field of Machine Learning and Pattern Recognition, pushing forward the concepts of automatic feature extraction and unsupervised learning in general. However, despite its strong success in both science and business, deep learning has its own limitations. It is often questioned whether such techniques are merely brute-force statistical approaches that only work in the context of high-performance computing with massive amounts of data. Another important question is whether they are really biologically inspired, as claimed in certain cases, and whether they can scale well in terms of "intelligence". This dissertation focuses on trying to answer these key questions in the context of Computer Vision and, in particular, Object Recognition, a task that has been heavily transformed by recent advances in the field. Practically speaking, the answers are based on an extensive comparison of two very different deep learning techniques on this task: the Convolutional Neural Network (CNN) and Hierarchical Temporal Memory (HTM). They represent two different approaches and points of view under the broad umbrella of deep learning, and are good choices for understanding and pointing out the strengths and weaknesses of each. The CNN is considered one of the most classic and powerful supervised methods used today in machine learning and pattern recognition, especially in object recognition. CNNs are well received and accepted by the scientific community and are already deployed by large corporations such as Google and Facebook to solve face recognition and image auto-tagging problems.
HTM, on the other hand, is an emerging paradigm: a mainly unsupervised method that is more biologically inspired. It tries to draw on the computational neuroscience community in order to incorporate concepts such as time, context and attention into the learning process, as is typical of the human brain. In the end, the thesis aims to show that in certain cases, with less data, HTM can outperform CNN.

Relevance: 20.00%

Abstract:

In the field of computer-assisted orthopedic surgery (CAOS), the anterior pelvic plane (APP) is a common concept for determining pelvic orientation by digitizing distinct pelvic landmarks. As percutaneous palpation is known to be error-prone, especially for obese patients, B-mode ultrasound (US) imaging could provide an alternative. Several concepts for using ultrasound imaging to determine the APP landmarks have been introduced. In this paper we present a novel technique that uses local patch statistical shape models (SSMs) and a hierarchical speed-of-sound compensation strategy for an accurate determination of the APP. These patches are independently matched and instantiated with respect to associated point clouds derived from the acquired ultrasound images. Potential inaccuracies due to the assumption of a constant speed of sound are compensated by an extended reconstruction scheme. We validated our method with in-vitro studies using a plastic bone covered with a soft-tissue simulation phantom, and with a preliminary cadaver trial.
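The effect of an incorrect speed-of-sound assumption can be sketched as a depth rescaling. The code below is a simplified illustration, not the paper's hierarchical compensation scheme: scanners convert echo time to depth using an assumed speed (1540 m/s is a commonly assumed soft-tissue value), so if the true speed is c_actual the reconstructed depth scales by c_actual / c_assumed. The numeric values here are illustrative only.

```python
def compensate_depth(points, c_assumed=1540.0, c_actual=1480.0):
    """Rescale ultrasound depths for a wrong speed-of-sound
    assumption: depth = c * echo_time / 2, so reconstructed depth
    scales linearly with the assumed speed."""
    scale = c_actual / c_assumed
    return [(x, y, z * scale) for (x, y, z) in points]

corrected = compensate_depth([(0.0, 0.0, 10.0)])
print(round(corrected[0][2], 3))  # depth shrinks from 10.0 toward ~9.61
```

A hierarchical scheme would apply different corrections per tissue layer rather than one global factor, but each correction reduces to this kind of rescaling along the beam direction.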

Relevance: 20.00%

Abstract:

Children who experience adverse reactions to cow's milk or who have diseases predisposing them to low bone mass are often prescribed a supplementation of calcium and vitamin D3, but adherence can be poor. Age-specific preferences for different formulations may exist and at least partially explain poor compliance.

Relevance: 20.00%

Abstract:

Intravenous immunoglobulin (IVIG) is the first-line therapy for multifocal motor neuropathy (MMN). This open-label multi-centre study (NCT00701662) assessed the efficacy, safety, and convenience of subcutaneous immunoglobulin (SCIG) in patients with MMN over 6 months, as an alternative to IVIG. Eight MMN patients (42-66 years), on stable IVIG dosing, received weekly SCIG at doses equivalent to previous IVIG using a "smooth transition protocol". The primary efficacy endpoint was the change from baseline to week 24 in muscle strength. Disability, motor function, and health-related quality of life (HRQL) endpoints were also assessed. One patient deteriorated despite a dose increase and was withdrawn. Muscle strength, disability, motor function, and health status were unchanged in all seven study completers, who rated home treatment as extremely good. Four patients experienced 18 adverse events, of which only two were moderate. This study suggests that MMN patients with a stable clinical course on regular IVIG can be switched to SCIG at the same monthly dose without deterioration and with a sustained overall improvement in HRQL.

Relevance: 20.00%

Abstract:

In recent years, enamel matrix derivative (EMD) has garnered much interest in the dental field for its apparent bioactivity that stimulates regeneration of periodontal tissues including periodontal ligament, cementum and alveolar bone. Despite its widespread use, the underlying cellular mechanisms remain unclear, and an understanding of its biological interactions could identify new strategies for tissue engineering. Previous in vitro research has demonstrated that EMD promotes premature osteoblast clustering at early time points. The aim of the present study was to evaluate the influence of cell clustering on vital osteoblast cell-cell communication and adhesion molecules, connexin 43 (cx43) and N-cadherin (N-cad), as assessed by immunofluorescence imaging, real-time PCR and Western blot analysis. In addition, differentiation markers of osteoblasts were quantified using alkaline phosphatase, osteocalcin and von Kossa staining. EMD significantly increased the expression of connexin 43 and N-cadherin at early time points ranging from 2 to 5 days. Protein expression was localized to cell membranes when compared to control groups. Alkaline phosphatase activity was also significantly increased on EMD-coated samples at 3, 5 and 7 days post seeding. Interestingly, higher activity was localized to cell cluster regions. There was a threefold increase in osteocalcin and bone sialoprotein mRNA levels for osteoblasts cultured on EMD-coated culture dishes. Moreover, EMD significantly increased extracellular mineral deposition in cell clusters as assessed through von Kossa staining at 5, 7, 10 and 14 days post seeding. We conclude that EMD up-regulates the expression of vital osteoblast cell-cell communication and adhesion molecules, which enhances the differentiation and mineralization activity of osteoblasts. These findings provide further support for the clinical evidence that EMD increases the speed and quality of new bone formation in vivo.

Relevance: 20.00%

Abstract:

Delineating brain tumor boundaries from magnetic resonance images is an essential task for the analysis of brain cancer. We propose a fully automatic method for brain tissue segmentation, which combines Support Vector Machine classification using multispectral intensities and textures with subsequent hierarchical regularization based on Conditional Random Fields. The CRF regularization introduces spatial constraints into the powerful SVM classification, which assumes voxels to be independent of their neighbors. The approach first separates healthy and tumor tissue before the two regions are subclassified into cerebrospinal fluid, white matter and gray matter, and into necrotic, active and edema regions, respectively, in a novel hierarchical way. The hierarchical approach adds robustness and speed by allowing different levels of regularization to be applied at different stages. The method is fast and tailored to standard clinical acquisition protocols. It was assessed on 10 multispectral patient datasets, with results outperforming previous methods in terms of segmentation detail and computation times.
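The two-stage hierarchy (healthy versus tumor first, then region subclassification) can be sketched schematically. In the toy code below, threshold rules on a 1-D intensity stand in for the paper's SVM classifiers and CRF regularization; only the hierarchical control flow is meant to be faithful, and all names and thresholds are illustrative.

```python
def hierarchical_segment(voxels, is_tumor, tumor_sub, healthy_sub):
    """Two-stage hierarchical labeling: stage 1 separates healthy
    from tumor tissue, stage 2 subclassifies each region with its
    own classifier, so coarse and fine decisions can use different
    features or regularization strength."""
    labels = []
    for v in voxels:
        if is_tumor(v):
            labels.append("tumor/" + tumor_sub(v))
        else:
            labels.append("healthy/" + healthy_sub(v))
    return labels

# Toy 1-D "intensities": >100 counts as tumor; the sub-rules are
# placeholders for trained per-region classifiers.
voxels = [50, 80, 120, 200]
out = hierarchical_segment(
    voxels,
    is_tumor=lambda v: v > 100,
    tumor_sub=lambda v: "necrotic" if v > 150 else "edema",
    healthy_sub=lambda v: "gray" if v < 60 else "white",
)
print(out)  # ['healthy/gray', 'healthy/white', 'tumor/edema', 'tumor/necrotic']
```

The payoff of this structure is that each stage sees a simpler, more balanced problem than one flat multi-class classifier would.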

Relevance: 20.00%

Abstract:

Two experiments investigated the structure of memory for titles of 54 familiar tunes. The titles were presented in the form of a hierarchy, with nodes labeled by genre (e.g., Rock or Patriotic). Four groups of subjects received logical or randomized titles, and logical or randomized labels. Goodness of label and title structure had equal and additive beneficial effects on recall with a 3-min exposure of the stimuli. With a 4-min exposure, good title structure became a larger contributor to good recall. Clustering analyses suggested that subjects were mentally representing the tune titles hierarchically, even when presentation was random.

Relevance: 20.00%

Abstract:

A central design challenge facing network planners is how to select a cost-effective network configuration that can provide uninterrupted service despite edge failures. In this paper, we study the Survivable Network Design (SND) problem, a core model underlying the design of such resilient networks that incorporates complex cost and connectivity trade-offs. Given an undirected graph with specified edge costs and (integer) connectivity requirements between pairs of nodes, the SND problem seeks the minimum cost set of edges that interconnects each node pair with at least as many edge-disjoint paths as the connectivity requirement of the nodes. We develop a hierarchical approach for solving the problem that integrates ideas from decomposition, tabu search, randomization, and optimization. The approach decomposes the SND problem into two subproblems, Backbone design and Access design, and uses an iterative multi-stage method for solving the SND problem in a hierarchical fashion. Since both subproblems are NP-hard, we develop effective optimization-based tabu search strategies that balance intensification and diversification to identify near-optimal solutions. To initiate this method, we develop two heuristic procedures that can yield good starting points. We test the combined approach on large-scale SND instances, and empirically assess the quality of the solutions vis-à-vis optimal values or lower bounds. On average, our hierarchical solution approach generates solutions within 2.7% of optimality even for very large problems (that cannot be solved using exact methods), and our results demonstrate that the performance of the method is robust for a variety of problems with different size and connectivity characteristics.
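The connectivity requirement at the heart of SND, at least r edge-disjoint paths between a node pair, can be verified with max flow via Menger's theorem. The sketch below is a verification utility, not the paper's design heuristic: it gives every undirected edge unit capacity and counts augmenting paths with BFS (Edmonds-Karp).

```python
from collections import deque, defaultdict

def edge_disjoint_paths(edges, s, t):
    """Count edge-disjoint s-t paths in an undirected graph via max
    flow (Menger's theorem): unit capacity per edge direction, BFS
    augmentation (Edmonds-Karp)."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v in edges:
        cap[(u, v)] += 1
        cap[(v, u)] += 1
        adj[u].add(v)
        adj[v].add(u)
    flow = 0
    while True:
        # BFS for an augmenting path in the residual graph
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        # push one unit of flow back along the path found
        v = t
        while parent[v] is not None:
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1

# A 4-cycle plus one diagonal: three edge-disjoint paths from 0 to 2
# (direct, via 1, and via 3), so a requirement of 3 is satisfied.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(edge_disjoint_paths(edges, 0, 2))  # 3
```

A candidate SND solution is feasible exactly when this count meets or exceeds the stated requirement for every node pair; the hard part the paper addresses is choosing a cheapest edge set with that property.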