858 results for optimization-based similarity reasoning


Relevance: 30.00%

Abstract:

Photovoltaic (PV) solar panels generally produce electricity in the 6% to 16% efficiency range, the rest of the incident energy being dissipated as thermal losses. To recover this energy, hybrid photovoltaic-thermal (PVT) systems have been devised: devices that simultaneously convert solar energy into electricity and heat. It is therefore worthwhile to study the PVT system globally from different points of view, in order to evaluate the advantages, disadvantages and possible uses of this technology. In particular, Chapter II develops the numerical optimization of the PVT absorber by a genetic algorithm, analyzing different internal channel profiles in order to find a good compromise between performance and technical and economic feasibility. In Chapter III, thanks to a mobile structure built at the university laboratory, the electrical and thermal output power of PVT panels is compared experimentally with that of separate photovoltaic and solar thermal production. By collecting a large amount of experimental data under different seasonal conditions (ambient temperature, irradiation, wind, ...), the aim of this mobile structure has been to evaluate the average increase or decrease in thermal and electrical efficiency obtained over the year with respect to the separate productions. In Chapter IV, new equation-based models of PVT and solar thermal panels under steady-state conditions are developed in the Dymola software, which uses the Modelica language. Compared with previous system-modelling software, this makes it possible to model and evaluate, in a simplified way, different design concepts for the PVT panel structure before prototyping and measuring it. Chapter V concerns instead the definition of the PVT boundary conditions within an HVAC system. This was done through year-long simulations in the Polysun software, in order to assess the best solar-assisted integrated configuration by means of the F_save (solar energy saving) factor. Finally, Chapter VI presents the conclusions and perspectives of this PhD work.
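
The abstract gives no code, but the optimization loop it describes can be sketched as follows: a minimal, hypothetical genetic algorithm in Python, where the channel profile is encoded as a vector of geometric parameters and `thermal_electrical_score` stands in for the thermal/electrical and technical-economic evaluation used in the thesis (all names and parameter values are illustrative assumptions).

```python
import random

def thermal_electrical_score(profile):
    # Placeholder fitness: the thesis evaluates each channel profile with a
    # thermal/electrical model plus technical-economic penalties.
    return -sum((p - 0.5) ** 2 for p in profile)

def genetic_algorithm(n_params=6, pop_size=30, generations=100,
                      mutation_rate=0.1):
    pop = [[random.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=thermal_electrical_score, reverse=True)
        survivors = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_params)    # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_params):              # per-gene mutation
                if random.random() < mutation_rate:
                    child[i] = random.random()
            children.append(child)
        pop = survivors + children
    return max(pop, key=thermal_electrical_score)

best_profile = genetic_algorithm()
```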

Relevance: 30.00%

Abstract:

A novel nanosized and addressable sensing platform, based on membrane-coated plasmonic particles, for the detection of protein adsorption using dark-field scattering spectroscopy of single particles has been established. To this end, a detailed analysis of the deposition of gold nanorods on differently functionalized substrates is performed with respect to various factors (such as pH, ionic strength, concentration of the colloidal suspension, and incubation time) in order to find the optimal conditions for obtaining a homogeneous distribution of particles at the desired surface number density. The possibility of successfully draping lipid bilayers over the gold particles immobilized on glass substrates depends on the careful adjustment of parameters such as membrane curvature and adhesion properties, and is demonstrated with complementary techniques such as phase-imaging AFM, fluorescence microscopy (including FRAP) and single-particle spectroscopy. The functionality and sensitivity of the proposed sensing platform are unequivocally certified by the resonance shifts of the plasmonic particles, individually interrogated with single-particle spectroscopy, upon the adsorption of streptavidin to biotinylated lipid membranes. This new detection approach, which employs particles as nanoscopic reporters for biomolecular interactions, ensures a highly localized sensitivity that offers the possibility to screen lateral inhomogeneities of native membranes. As an alternative to the 2D array of gold nanorods, short-range-ordered arrays of nanoholes in optically transparent gold films, or regular arrays of truncated-tetrahedron-shaped particles, are built by means of colloidal nanolithography on transparent substrates. Technical issues, mainly related to the optimization of the mask deposition conditions, are successfully addressed such that extended areas of homogeneously nanostructured gold surfaces are achieved. The adsorption of the proteins annexin A1 and prothrombin on multicomponent lipid membranes, as well as the hydrolytic activity of the phospholipase PLA2, were investigated with classical techniques such as AFM, ellipsometry and fluorescence microscopy. At first, the issues of lateral phase separation in membranes of various lipid compositions, and the dependence of the domain configuration (sizes and shapes) on the membrane composition, are addressed. It is shown that the tendency for phase segregation of gel and fluid phase lipid mixtures is accentuated, in the presence of divalent calcium ions, for membranes containing anionic lipids as compared to neutral bilayers. Annexin A1 adsorbs preferentially and irreversibly on preformed phosphatidylserine (PS) enriched lipid domains but, depending on the PS content of the bilayer, the protein itself may induce clustering of the anionic lipids into areas with high binding affinity. Corroborating evidence from AFM and fluorescence experiments confirms the hypothesis of a specifically increased hydrolytic activity of PLA2 on the highly curved regions of membranes, due to facilitated access of the lipase to the cleavage sites of the lipids. The influence of the nanoscale gold surface topography on the adhesion of lipid vesicles is unambiguously demonstrated, and this provides, at least in part, an answer to the controversial question in the literature about the behavior of lipid vesicles interacting with bare gold substrates. The possibility of forming monolayers of lipid vesicles on chemically untreated gold substrates decorated with gold nanorods opens new perspectives for biosensing applications that involve the radiative decay engineering of the plasmonic particles.
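
The readout quantity in such single-particle experiments is the shift of the plasmon resonance wavelength upon protein binding. A minimal sketch of how such a shift could be extracted from two measured scattering spectra (the parabolic-fit refinement around the maximum is an assumption of this sketch, not the authors' exact procedure):

```python
import numpy as np

def resonance_wavelength(wavelengths, intensities, half_window=5):
    """Locate the scattering peak, refined by a local parabolic fit."""
    i = int(np.argmax(intensities))
    lo, hi = max(0, i - half_window), min(len(wavelengths), i + half_window + 1)
    a, b, _ = np.polyfit(wavelengths[lo:hi], intensities[lo:hi], 2)
    return -b / (2 * a)  # vertex of the fitted parabola

# Resonance shift of one particle upon streptavidin adsorption:
# shift = resonance_wavelength(wl, spec_after) - resonance_wavelength(wl, spec_before)
```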

Relevance: 30.00%

Abstract:

Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult, if not outright infeasible, to understand whether they meet some requirement or are able to exhibit some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking, identifying any deviation from the desired behaviour as soon as possible and, where possible, applying corrections. The declarative framework that implements our approach, entirely developed on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology makes it possible to reconcile any deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology brings some advancements to the conformance checking problem, helping to fill the gap between humans and increasingly complex technology.
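
The framework itself is built on Drools, but the idea behind Event-Condition-Expectation rules can be illustrated with a small hypothetical Python sketch (names and rule content are assumptions, not the thesis' formalism): when an event satisfying a rule's condition is observed, an expectation is raised that a later event must fulfil before a deadline; expired expectations are reported as violations just in time.

```python
from dataclasses import dataclass, field

@dataclass
class Expectation:
    fulfil: callable        # predicate a later event must satisfy
    deadline: float         # absolute time limit
    description: str

@dataclass
class Monitor:
    pending: list = field(default_factory=list)
    violations: list = field(default_factory=list)

    def on_event(self, event, now):
        # Discharge any pending expectation matched by this event.
        self.pending = [e for e in self.pending if not e.fulfil(event)]
        # Example ECE rule: every 'request' raises the expectation of a
        # matching 'response' within 5 time units.
        if event["type"] == "request":
            rid = event["id"]
            self.pending.append(Expectation(
                fulfil=lambda ev, rid=rid: ev["type"] == "response"
                                           and ev["id"] == rid,
                deadline=now + 5.0,
                description=f"response to request {rid}"))
        # JIT check: report expired expectations immediately.
        expired = [e for e in self.pending if now > e.deadline]
        self.violations += [e.description for e in expired]
        self.pending = [e for e in self.pending if now <= e.deadline]

m = Monitor()
m.on_event({"type": "request", "id": 1}, now=0.0)
m.on_event({"type": "response", "id": 1}, now=3.0)  # expectation fulfilled
```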

Relevance: 30.00%

Abstract:

Dextran-based polymers are versatile hydrophilic materials which can provide functionalized surfaces in various areas, including biological and medical applications. Functional, responsive, dextran-based hydrogels are crosslinked dextran-based polymers that allow the modulation of the response towards external stimuli. The controlled modulation of hydrogel properties towards specific applications, and the detailed characterization of the optical, mechanical, and chemical properties, are of strong interest in science and in further applications. In particular, the structural characteristics of swollen hydrogel matrices, and the characterization of their variations upon environmental changes, are challenging. Depending on their properties, hydrogels are applied as actuators or biosensors, in drug delivery and tissue engineering, or for medical coatings; the field of possible applications still shows potential for expansion.

Surface-attached hydrogel films with a thickness of several micrometers can serve as a waveguiding matrix for leaky optical waveguide modes. On the basis of highly swelling, waveguiding, dextran-based hydrogel films, an optical biosensor concept was developed. The synthesis of a dextran-based hydrogel matrix, its functionalization to modulate its response towards external stimuli, and the characterization of the swollen hydrogel films were the main interests within this biosensor project. A second focus was the optimization of the hydrogel characteristics for cell growth, with the aim of creating scaffolds for bone regeneration. Matrix modification enabling successful cell-growth experiments with endothelial cells and osteoblasts was achieved.

A photo-crosslinkable, carboxymethylated, dextran-based hydrogel (PCMD) was synthesized and characterized in terms of swelling behaviour and structural properties. Further functionalization was carried out before and after crosslinking. This functionalization aimed at the external manipulation of the swelling degree and of the charge of the hydrogel matrix, both important for biosensor experiments as well as for cell adhesion. The modulation of the responses of functionalized PCMD hydrogels to pH, ion concentration, electrochemical switching, or a magnetic force was investigated.

The PCMD hydrogel films were optically characterized by combining surface plasmon resonance (SPR) and optical waveguide mode spectroscopy (OWS). This technique allows a detailed analysis of the refractive index profile perpendicular to the substrate surface by applying the Wentzel-Kramers-Brillouin (WKB) approximation.

In order to perform biosensor experiments, analyte-capturing units such as proteins or antibodies were covalently coupled to the crosslinked hydrogel backbone by applying active ester chemistry. Consequently, target analytes could be located inside the waveguiding matrix. Using labeled analytes, fluorescence enhancement was achieved by fluorescence excitation with the electromagnetic field in the center of the optical waveguide modes; the fluorescence excited by the evanescent electromagnetic field of the surface plasmon was 2-3 orders of magnitude lower. Furthermore, the signal-to-noise ratio was improved by fluorescence excitation with leaky optical waveguide modes.

The applicability of the PCMD hydrogel sensor matrix to clinically relevant samples was proven in a cooperation project for the detection of PSA in serum with long-range surface plasmon spectroscopy (LRSP) and fluorescence excitation by LRSP (LR-SPFS).
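
The WKB analysis of the leaky waveguide modes mentioned above rests on a phase-integral condition; in schematic form (the boundary phase shifts depend on polarization and on the metal/superstrate interfaces, and are an assumption of this sketch rather than the exact expressions used in the thesis):

\[
\int_0^{z_m} \sqrt{k_0^2\, n^2(z) - \beta_m^2}\;\mathrm{d}z \;=\; m\pi + \varphi_0 + \varphi_t,
\qquad m = 0, 1, 2, \dots
\]

where \(n(z)\) is the refractive index profile of the swollen hydrogel perpendicular to the substrate, \(k_0 = 2\pi/\lambda\), \(\beta_m\) is the propagation constant of mode \(m\), and \(z_m\) its turning point; inverting the set of measured \(\beta_m\) yields the profile \(n(z)\).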

Relevance: 30.00%

Abstract:

This thesis presents several techniques designed to drive a swarm of robots in an a priori unknown environment, moving the group from a starting area to a final one while avoiding obstacles. The presented techniques are based on two different theories, used alone or in combination: Swarm Intelligence (SI) and graph theory. Both are based on the study of interactions between entities (also called agents or units) in Multi-Agent Systems (MAS); the first belongs to the Artificial Intelligence context and the second to the Distributed Systems context. These theories, each from its own point of view, exploit the emergent behaviour that arises from the interactive work of the entities in order to achieve a common goal. The flexibility and adaptability of the swarm have been exploited with the aim of overcoming and minimizing difficulties and problems that can affect one or more units of the group, with minimal impact on the whole group and on the common main target. Another aim of this work is to show the importance of the information shared between the units of the group, such as the communication topology, because it helps to keep the environmental information detected by each single agent updated across the swarm. Swarm Intelligence has been applied to the presented technique through the Particle Swarm Optimization (PSO) algorithm, taking advantage of its features as a navigation system. Graph theory has been applied by exploiting consensus and the agreement protocol, with the aim of maintaining the units in a desired and controlled formation. This approach has been followed in order to preserve the power of PSO while controlling part of its random behaviour with a distributed control algorithm such as consensus.
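
The PSO update that serves as the navigation layer can be sketched in a few lines of Python; this is the generic textbook form with inertia weight w and acceleration coefficients c1, c2 (the thesis additionally layers consensus-based formation control on top, which is not shown here).

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One synchronous PSO update for a swarm of points in R^n."""
    for i, (x, v) in enumerate(zip(positions, velocities)):
        for d in range(len(x)):
            r1, r2 = random.random(), random.random()
            v[d] = (w * v[d]
                    + c1 * r1 * (pbest[i][d] - x[d])   # cognitive pull
                    + c2 * r2 * (gbest[d] - x[d]))     # social pull
            x[d] += v[d]
    return positions, velocities
```

Here `pbest[i]` is unit i's best visited position and `gbest` the swarm-wide best; sharing `gbest` is exactly the kind of communicated information whose topology the thesis emphasizes.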

Relevance: 30.00%

Abstract:

The goal of this dissertation is the experimental characterization and quantitative description of the hybridization of complementary nucleic acid strands with surface-bound capture molecules, for the development of integrated biosensors. In contrast to solution-based methods, microarray substrates allow many nucleic acid combinations to be investigated in parallel. As a biologically relevant evaluation system, the actin gene, universally expressed in eukaryotes, from different plant species was used. This test system makes it possible to characterize closely related plant species on the basis of small differences in the gene sequence (SNPs). Building on this well-studied model of a housekeeping gene, a comprehensive microarray system consisting of short and long oligonucleotides (with incorporated LNA molecules), cDNAs, and DNA and RNA targets was realized. In this way, a test system with high signal intensities, optimized for online measurement, was developed. Based on the results, the entire signal path, from nucleic acid concentration to digital value, was modelled. The insights into the kinetics and thermodynamics of hybridization gained from the development and the experiments are summarized in three publications that form the backbone of this dissertation. The first publication describes the improvement of the reproducibility and specificity of microarray results through online measurement of kinetics and thermodynamics, compared with endpoint-based measurements on standard microarrays. For the analysis of the huge amounts of data, two algorithms were developed: a reaction-kinetic modelling of the isotherms and a description of the melting transition based on Fermi-Dirac statistics. These algorithms are described in the second publication. By realizing identical sequences in the chemically different nucleic acids (DNA, RNA and LNA), it is possible to investigate defined differences in the conformation of the ribose ring and in the C5 methyl group of the pyrimidines. The competitive interaction of these different nucleic acids of identical sequence, and its effects on kinetics and thermodynamics, is the subject of the third publication. Beyond the molecular-biological and technological development in the sensing of hybridization reactions of surface-bound nucleic acid molecules, and the automated analysis and modelling of the resulting data volumes with the correspondingly better quantitative description of the kinetics and thermodynamics of these reactions, the results contribute to a better understanding of the physico-chemical structure of this most elementary biological molecule and of its still not fully understood specificity.
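
The Fermi-Dirac description of the melting transition mentioned above amounts to fitting a two-parameter sigmoid to the fraction of hybridized duplexes versus temperature; a minimal sketch with SciPy (variable names and the data points are illustrative, not measured values):

```python
import numpy as np
from scipy.optimize import curve_fit

def melting_curve(T, Tm, width):
    """Fermi-Dirac-like occupancy: fraction still hybridized at temperature T."""
    return 1.0 / (1.0 + np.exp((T - Tm) / width))

# Synthetic example data: T in deg C, theta = normalized hybridization signal
T = np.array([30, 35, 40, 45, 50, 55, 60, 65, 70], dtype=float)
theta = np.array([0.98, 0.95, 0.90, 0.75, 0.50, 0.25, 0.10, 0.04, 0.02])

(Tm, width), _ = curve_fit(melting_curve, T, theta, p0=(50.0, 3.0))
print(f"melting temperature Tm = {Tm:.1f} C, transition width = {width:.1f} K")
```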

Relevance: 30.00%

Abstract:

The research aims at developing a framework for semantic-based digital survey of architectural heritage. Rooted in knowledge-based modeling, which extracts mathematical constraints on geometry from architectural treatises, as-built information about the architecture obtained from image-based modeling is integrated with the ideal model in a BIM platform. The knowledge-based modeling transforms the geometry and parametric relations of architectural components from 2D drawings into 3D digital models, and creates a large number of variations in real time, based on shape grammar, thanks to parametric modeling. It also provides prior knowledge for semantically segmenting unorganized survey data. The emergence of SfM (Structure from Motion) provides access to the reconstruction of large, complex architectural scenes with high flexibility, low cost and full automation, but low reliability of metric accuracy. We address this problem by combining photogrammetric approaches consisting of camera configuration, image enhancement, bundle adjustment, etc. Experiments show that the accuracy of image-based modeling following our workflow is comparable to that of range-based modeling. We also demonstrate positive results of our optimized approach in the digital reconstruction of a portico, where a low-texture vault and dramatic transitions of illumination cause huge difficulties for the workflow without optimization. Once the as-built model is obtained, it is integrated with the ideal model in the BIM platform, which allows multiple forms of data enrichment. In spite of its promising prospects in the AEC industry, BIM has been developed with limited consideration of reverse engineering from survey data. Besides representing the architectural heritage in parallel ways (ideal model and as-built model) and comparing their differences, we address how to create the as-built model in BIM software, which is still an open problem. The research is intended to be fundamental for research on architectural history, for the documentation and conservation of architectural heritage, and for the renovation of existing buildings.
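
Bundle adjustment, one of the photogrammetric ingredients mentioned above, minimizes the reprojection error over camera poses and 3D points. A compact statement of the residual for a pinhole camera (a generic formulation and hypothetical helper, not the thesis' exact parameterization):

```python
import numpy as np

def reprojection_residual(K, R, t, X, uv_observed):
    """Residual between an observed image point and the projection of X.
    K: 3x3 intrinsics, R: 3x3 rotation, t: translation 3-vector, X: 3D point."""
    x_cam = R @ X + t          # world -> camera coordinates
    x_img = K @ x_cam          # camera -> homogeneous pixel coordinates
    uv = x_img[:2] / x_img[2]  # perspective division
    return uv - uv_observed    # bundle adjustment minimizes sum of |residual|^2
```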

Relevance: 30.00%

Abstract:

In many application domains, data can be naturally represented as graphs. When the application of analytical solutions for a given problem is unfeasible, machine learning techniques can be a viable way to solve it. Classical machine learning techniques are defined for data represented in vectorial form; recently, some of them have been extended to deal directly with structured data. Among these techniques, kernel methods have shown promising results from both the computational complexity and the predictive performance points of view. Kernel methods make it possible to avoid an explicit mapping into vectorial form by relying on kernel functions, which, informally, are functions calculating a similarity measure between two entities. However, the definition of good kernels for graphs is a challenging problem because of the difficulty of finding a good tradeoff between computational complexity and expressiveness. Another problem we face is learning on data streams, where a potentially unbounded sequence of data is generated by some source. There are three main contributions in this thesis. The first contribution is the definition of a new family of kernels for graphs based on Directed Acyclic Graphs (DAGs). We analyzed two kernels from this family, achieving state-of-the-art results, from both the computational and the classification points of view, on real-world datasets. The second contribution consists in making the application of learning algorithms to streams of graphs feasible; moreover, we defined a principled way of managing memory. The third contribution is the application of machine learning techniques for structured data to non-coding RNA function prediction. In this setting, the secondary structure is thought to carry relevant information; however, existing methods that consider the secondary structure have prohibitively high computational complexity. We propose to apply kernel methods to this domain, obtaining state-of-the-art results.
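
As a flavor of what a structural graph kernel computes, here is a toy Python kernel that counts matching one-level rooted subtree signatures, a much simplified relative of the DAG-based kernels defined in the thesis (the encoding as adjacency dict plus label dict is an assumption of this sketch):

```python
from collections import Counter

def subtree_signatures(adj, labels):
    """One signature per vertex: its label plus the sorted labels of its neighbors."""
    return Counter(
        (labels[v], tuple(sorted(labels[u] for u in adj[v])))
        for v in adj)

def subtree_kernel(g1, g2):
    """Kernel value = dot product of the two signature count vectors."""
    s1, s2 = subtree_signatures(*g1), subtree_signatures(*g2)
    return sum(count * s2[sig] for sig, count in s1.items())

# Each graph: (adjacency dict, vertex-label dict)
g_a = ({0: [1, 2], 1: [0], 2: [0]}, {0: "C", 1: "H", 2: "H"})
g_b = ({0: [1], 1: [0, 2], 2: [1]}, {0: "H", 1: "C", 2: "H"})
print(subtree_kernel(g_a, g_b))  # -> 5
```

Because the kernel is a dot product of explicit count vectors, it is positive semidefinite by construction; the research-grade kernels trade richer substructures against exactly the computational cost the abstract mentions.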

Relevance: 30.00%

Abstract:

The research activity focused on investigating the correlation between the degree of purity, in terms of chemical dopants, of organic small-molecule semiconductors and their electrical and optoelectronic performance once introduced as the active material in devices. The first step of the work addressed the variation of the electrical performance of two commercial organic semiconductors after processing by thermal sublimation. In particular, the p-type 2,2′′′-dihexyl-2,2′:5′,2′′:5′′,2′′′-quaterthiophene (DH4T) and the n-type 2,2′′′-perfluoro-dihexyl-2,2′:5′,2′′:5′′,2′′′-quaterthiophene (DFH4T) underwent several sublimation cycles, with a consequent improvement of the electrical performance in terms of charge mobility and threshold voltage, highlighting the benefits brought by this treatment, through the removal of residual impurities, to the electrical properties of these semiconductors in OFET devices. The second step consisted in establishing a metal-free synthesis of DH4T, which was successfully prepared without organometallic reagents or catalysts in collaboration with Dr. Manuela Melucci from the ISOF-CNR Institute in Bologna. The experimental work then demonstrated that such compounds are responsible for electrical degradation, by intentionally doping the semiconductor obtained by the metal-free method with tetrakis(triphenylphosphine)palladium(0) (Pd(PPh3)4) and tributyltin chloride (Bu3SnCl), as well as with an organic impurity, 5-hexyl-2,2':5',2''-terthiophene (HexT3), in different concentrations (1, 5 and 10% w/w). After completing the entire evaluation loop, from fabricating OFET devices by vacuum sublimation with the intentionally doped batches to the final electrical characterization under inert-atmosphere conditions, commercial DH4T, metal-free DH4T and the intentionally doped DH4T were systematically compared. The fabrication of OFETs based on doped DH4T clearly pointed out that vacuum sublimation is not only an efficient purification method for crude semiconductors, but also a reliable way to fabricate high-performing devices.
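
The mobility and threshold-voltage figures used in such comparisons are conventionally extracted from the saturation-regime transfer curve, where I_D = (W/2L)·μ·C_i·(V_G − V_T)², so that √I_D is linear in V_G. A sketch of the standard linear fit (generic procedure, not the authors' specific scripts):

```python
import numpy as np

def saturation_mobility(Vg, Id, W, L, Ci):
    """Extract mobility and threshold voltage from the saturation regime.
    Vg [V], Id [A] restricted to saturation points; W, L [cm]; Ci [F/cm^2].
    sqrt(Id) vs Vg has slope sqrt(W*mu*Ci/(2L)) and x-intercept Vt."""
    slope, intercept = np.polyfit(Vg, np.sqrt(np.abs(Id)), 1)
    mu = 2 * L * slope**2 / (W * Ci)   # [cm^2/(V*s)]
    Vt = -intercept / slope            # [V]
    return mu, Vt
```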

Relevance: 30.00%

Abstract:

Geometric packing problems may be formulated mathematically as constrained optimization problems, but finding a good solution is a challenging task. The more complicated the geometry of the container or of the objects to be packed, the more complex the non-penetration constraints become. In this work we propose the use of a physics engine that simulates a system of colliding rigid bodies as a tool to resolve interpenetration conflicts and to optimize configurations locally. We develop an efficient and easy-to-implement physics engine that is specialized for collision detection and contact handling. In the course of developing this engine, a number of novel algorithms for distance calculation and intersection volume were designed and implemented, which are presented in this work. They are highly specialized to provide fast responses for cuboids and triangles as input geometry, whereas the concepts they are based on can easily be extended to other convex shapes. Especially noteworthy in this context is our ε-distance algorithm, a novel approach that is not only very robust and fast but also compact in its implementation. Several state-of-the-art third-party implementations are presented, and we show that our implementations beat them in runtime and robustness. The packing algorithm that sits on top of the physics engine is a Monte Carlo based approach, implemented for packing cuboids into a container described by a triangle soup. We give an implementation for the SAE J1100 variant of the trunk packing problem, compare it to several established approaches, and show that it gives better results in less time than these existing implementations.
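
The thesis' ε-distance algorithm handles cuboids and triangles in general position; as a much simpler illustration of the kind of query a collision-detection engine answers, here is the exact separating distance between two axis-aligned boxes (the axis-aligned restriction is an assumption of this sketch that the real engine does not make):

```python
import math

def aabb_distance(min1, max1, min2, max2):
    """Euclidean gap between two axis-aligned boxes (0 if they intersect)."""
    gap_sq = 0.0
    for lo1, hi1, lo2, hi2 in zip(min1, max1, min2, max2):
        gap = max(lo2 - hi1, lo1 - hi2, 0.0)  # per-axis separation
        gap_sq += gap * gap
    return math.sqrt(gap_sq)

print(aabb_distance((0, 0, 0), (1, 1, 1), (2, 3, 0), (3, 4, 1)))  # -> sqrt(5)
```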

Relevance: 30.00%

Abstract:

Nowadays the rise of non-recurring engineering (NRE) costs associated with complexity is becoming a major factor in SoC design, limiting both scaling opportunities and the flexibility advantages offered by the integration of complex computational units. The introduction of embedded programmable elements can represent an appealing solution, able both to guarantee the desired flexibility and upgradability and to widen the SoC market. In particular, embedded FPGA (eFPGA) cores can provide bit-level optimization for those applications which benefit from synthesis, paying on the other hand in terms of performance penalties and area overhead with respect to standard-cell ASIC implementations. In this scenario, this thesis proposes a design methodology for a synthesizable programmable device designed to be embedded in a SoC. A soft-core embedded FPGA (eFPGA) is presented and analyzed in terms of the opportunities given by a fully synthesizable approach, following an implementation flow based on standard-cell methodology. A key point of the proposed eFPGA template is that it adopts a Multi-Stage Switching Network (MSSN) as the foundation of the programmable interconnects, since it can be efficiently synthesized and optimized through a standard-cell-based implementation flow, while ensuring an intrinsically congestion-free network topology. The flexibility potential of the eFPGA has been evaluated using different technology libraries (STMicroelectronics CMOS 65nm and BCD9s 0.11μm) through a design space exploration in terms of area-speed-leakage tradeoffs, enabled by the full synthesizability of the template. Since the most relevant disadvantage of the adopted soft approach, compared to a hard core, is the performance overhead, the eFPGA analysis targeted small area budgets. The generation of the configuration bitstream was achieved through the implementation of a custom CAD flow environment, which allowed functional verification and performance evaluation through an application-aware analysis.
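
To give a feel for why a multi-stage switching network synthesizes compactly, consider the classic Beneš topology, which is rearrangeably non-blocking and needs only O(N log N) 2×2 switches; a small Python sketch of its stage and switch counts (the thesis' MSSN need not be a Beneš network, this is only an illustrative reference point):

```python
import math

def benes_cost(n_ports):
    """Stages and 2x2 switch count of a Benes network for n_ports = 2^k."""
    k = int(math.log2(n_ports))
    assert 2 ** k == n_ports, "port count must be a power of two"
    stages = 2 * k - 1                  # two butterflies sharing the middle stage
    switches = stages * n_ports // 2    # n/2 two-by-two crossbars per stage
    return stages, switches

print(benes_cost(64))  # -> (11, 352)
```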

Relevance: 30.00%

Abstract:

In the present work, a novel route to a variety of polymer structures based on the clinically approved polymer poly(N-(2-hydroxypropyl)methacrylamide) (PHPMA) was developed. The synthetic access relies on the use of reactive-ester polymers on the one hand, and on the Reversible Addition Fragmentation Chain Transfer (RAFT) polymerization method on the other. This form of controlled radical polymerization made it possible to prepare not only better-defined homopolymers but also statistical and block copolymers. The reactive-ester polymers can be converted into HPMA-based systems by simple aminolysis; they can therefore be regarded as a promising basis for the synthesis of extensive polymer libraries. The polymers prepared combine different functionalities at a constant degree of polymerization. This allows optimization towards a specific application without changing the chain-length parameter.

Furthermore, using RAFT polymerization it was possible to prepare partially biodegradable block copolymers based on polylactides and HPMA, by coupling a chain transfer agent (CTA) to a well-defined polylactide homopolymer. These structures were varied in composition, equipped with recognition units (folates) and labels (fluorescent dyes and β+-emitting radionuclides), and subsequently evaluated in vitro and in vivo.

On the basis of these achievements, it was possible to investigate the influence of the polymer microstructure on the aggregation behaviour by means of light scattering and fluorescence correlation spectroscopy. It could be shown that only this information about superstructure formation can explain the kinetics of cellular uptake, demonstrating the important role of structure-activity relationships.

Thus, in addition to the synthesis, characterization and first biological evaluations, a contribution was made to a better understanding of the interaction of polymeric particles with biological systems.

Relevance: 30.00%

Abstract:

In this thesis, the author presents a query language for an RDF (Resource Description Framework) database and discusses its applications in the context of the HELM project (the Hypertextual Electronic Library of Mathematics). This language aims at meeting the main requirements coming from the RDF community. In particular it includes: a human-readable textual syntax and a machine-processable XML (Extensible Markup Language) syntax, both for queries and for query results; a rigorously exposed formal semantics; a graph-oriented RDF data access model capable of exploring an entire RDF graph (including both RDF Models and RDF Schemata); a full set of Boolean operators to compose the query constraints; fully customizable and highly structured query results having a 4-dimensional geometry; and some constructions taken from ordinary programming languages that simplify the formulation of complex queries. The HELM project aims at integrating the modern tools for the automation of formal reasoning with the most recent electronic publishing technologies, in order to create and maintain a hypertextual, distributed virtual library of formal mathematical knowledge. In the spirit of the Semantic Web, the documents of this library include RDF metadata describing their structure and content in a machine-understandable form. Using the author's query engine, HELM exploits this information to implement functionalities allowing the interactive and automatic retrieval of documents on the basis of content-aware requests that take into account the mathematical nature of these documents.
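
The query language itself is defined in the thesis; as a rough present-day analogue of the graph-oriented, content-aware retrieval it describes, here is how such a metadata query might look in Python with rdflib and SPARQL. This is an illustration only, not the author's syntax, and the metadata file and predicate URIs are hypothetical.

```python
from rdflib import Graph

g = Graph()
g.parse("helm_metadata.rdf")  # hypothetical RDF metadata for library documents

# Find documents whose metadata declares a given mathematical subject.
results = g.query("""
    SELECT ?doc ?title WHERE {
        ?doc <http://example.org/helm#subject> "commutative algebra" ;
             <http://example.org/helm#title>   ?title .
    }
""")
for doc, title in results:
    print(doc, title)
```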

Relevance: 30.00%

Abstract:

In this paper we present a novel hybrid approach for multimodal medical image registration based on diffeomorphic demons. Diffeomorphic demons have proven to be a robust and efficient approach to intensity-based image registration; a very recent extension even allows the use of mutual information (MI) as a similarity measure to register multimodal images. However, due to the intensity correspondence uncertainty existing in some anatomical parts, it is difficult for a purely intensity-based algorithm to solve the registration problem. Therefore, we propose to combine the resulting transformations from both intensity-based and landmark-based methods for multimodal non-rigid registration based on diffeomorphic demons. Several experiments on different types of MR images were conducted, for which we show that a better anatomical correspondence between the images can be obtained using the hybrid approach than using either intensity information or landmarks alone.
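
For reference, the classical intensity-based demons force that drives such registrations (Thirion's formulation with the standard normalization; the diffeomorphic variant composes these updates through an exponential map) can be sketched in NumPy:

```python
import numpy as np

def demons_force(F, M, eps=1e-9):
    """Thirion's demons update field, one displacement component per axis.
    F, M: fixed and moving images as equally shaped float arrays."""
    diff = M - F
    grad = np.gradient(F)                             # one array per image axis
    norm2 = sum(g * g for g in grad) + diff * diff    # |grad F|^2 + (M - F)^2
    return [diff * g / (norm2 + eps) for g in grad]
```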

Relevance: 30.00%

Abstract:

In this paper we propose a variational approach for multimodal image registration based on the diffeomorphic demons algorithm. Diffeomorphic demons has proven to be a robust and efficient way to perform intensity-based image registration; however, its main drawback is that it cannot deal with multiple modalities. We propose to replace the standard demons similarity metric (image intensity differences) by point-wise mutual information (PMI) in the energy function. By comparing the accuracy of our PMI-based diffeomorphic demons with the B-spline based free-form deformation approach (FFD) on simulated deformations, we show that the proposed algorithm performs significantly better.
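
The point-wise mutual information that replaces the intensity difference can be estimated from a joint histogram of the two images; a minimal NumPy sketch (the bin count and the histogram-based estimator are illustrative choices, not necessarily the paper's):

```python
import numpy as np

def pointwise_mutual_information(F, M, bins=32):
    """PMI(f, m) = log p(f, m) / (p(f) p(m)), looked up per voxel pair."""
    joint, f_edges, m_edges = np.histogram2d(F.ravel(), M.ravel(), bins=bins)
    joint = joint / joint.sum() + 1e-12           # joint probabilities, no zeros
    pf = joint.sum(axis=1, keepdims=True)         # marginal of F
    pm = joint.sum(axis=0, keepdims=True)         # marginal of M
    pmi_table = np.log(joint / (pf * pm))
    fi = np.clip(np.digitize(F.ravel(), f_edges[1:-1]), 0, bins - 1)
    mi = np.clip(np.digitize(M.ravel(), m_edges[1:-1]), 0, bins - 1)
    return pmi_table[fi, mi].reshape(F.shape)     # per-voxel PMI map
```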