892 results for Shadow and Highlight Invariant Algorithm.


Relevance:

100.00%

Publisher:

Abstract:

Recently there has been considerable interest in dynamic textures, due to the explosive growth of multimedia databases. Dynamic texture also appears in a wide range of videos, which makes it very important in applications that model physical phenomena. Dynamic textures have thus emerged as a new field of investigation that extends static (spatial) textures to the spatio-temporal domain. In this paper, we propose a novel approach for dynamic texture segmentation based on automata theory and the k-means algorithm. In this approach, a feature vector is extracted for each pixel by applying deterministic partially self-avoiding walks on three orthogonal planes of the video. These feature vectors are then clustered by the well-known k-means algorithm. Although k-means has shown interesting results, it only guarantees convergence to a local minimum, which affects the final segmentation. To overcome this drawback, we compare six initialization methods for k-means. The experimental results demonstrate the effectiveness of our proposed approach compared to state-of-the-art segmentation methods.
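
As a rough sketch of the clustering stage only (not the authors' implementation: random vectors stand in for the walk-based descriptors, and only two of the six initialization strategies are shown):

```python
# Minimal sketch: cluster per-pixel feature vectors with k-means and compare
# two initialization strategies.  The deterministic partially self-avoiding
# walk features are not reproduced here; random vectors are placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 16))   # one 16-d descriptor per pixel (placeholder)

for init in ("k-means++", "random"):
    km = KMeans(n_clusters=4, init=init, n_init=10, random_state=0).fit(features)
    print(f"{init:10s}  inertia = {km.inertia_:.1f}")

labels = km.labels_   # cluster index per pixel -> segmentation map
```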

Relevance:

100.00%

Publisher:

Abstract:

Distribution of the code is permitted under the terms of the three-clause BSD license.

Relevance:

100.00%

Publisher:

Abstract:

In recent years of research I have focused on different physiological problems. Together with my supervisors, I developed and improved several mathematical models in order to create valid tools for a better understanding of important clinical issues. The aim of all this work is to develop tools for learning and understanding cardiac and cerebrovascular physiology and pathology, generating research questions and building clinical decision support systems useful for intensive care unit patients.

I. ICP Model Designed for Medical Education
We developed a comprehensive cerebral blood flow and intracranial pressure model to simulate and study the complex interactions in cerebrovascular dynamics caused by multiple simultaneous alterations, including normal and abnormal functional states of cerebral autoregulation. Individual published equations (derived from prior animal and human studies) were implemented into a comprehensive simulation program. The normal physiological modelling included intracranial pressure, cerebral blood flow, blood pressure, and carbon dioxide (CO2) partial pressure. We also added external and pathological perturbations, such as head-up position and intracranial haemorrhage. The model behaved in a clinically realistic way when given inputs from published traumatized patients and from cases encountered by clinicians. The pulsatile nature of the output graphics was easy for clinicians to interpret. The simulated manoeuvres include changes of basic physiological inputs (e.g. blood pressure, central venous pressure, CO2 tension, head-up position, and respiratory effects on vascular pressures) as well as pathological inputs (e.g. acute intracranial bleeding and obstruction of cerebrospinal fluid outflow). Based on the results, we believe the model would be useful to teach the complex relationships of brain haemodynamics and to study clinical research questions such as the optimal head-up position, the effects of intracranial haemorrhage on cerebral haemodynamics, and the best CO2 concentration to reach the optimal compromise between intracranial pressure and perfusion. We believe this model would be useful for both beginners and advanced learners. It could also be used by practicing clinicians to model individual patients (entering the effects of needed clinical manipulations, and then running the model to test for optimal combinations of therapeutic manoeuvres).

II. A Heterogeneous Cerebrovascular Mathematical Model
Cerebrovascular pathologies are extremely complex, owing to the multitude of factors acting simultaneously on cerebral haemodynamics. In this work, the mathematical model of cerebral haemodynamics and intracranial pressure dynamics described in point I is extended to account for heterogeneity in cerebral blood flow. The model includes the Circle of Willis, six regional districts independently regulated by autoregulation and CO2 reactivity, distal cortical anastomoses, the venous circulation, the cerebrospinal fluid circulation, and the intracranial pressure-volume relationship. Results agree with data in the literature and highlight the existence of a monotonic relationship between the transient hyperaemic response and the autoregulation gain. During unilateral internal carotid artery stenosis, local blood flow regulation is progressively lost in the ipsilateral territory, with the presence of a steal phenomenon, while the anterior communicating artery plays the major role in redistributing the available blood flow. Conversely, the distal collateral circulation plays the major role during unilateral occlusion of the middle cerebral artery. In conclusion, the model is able to reproduce several different pathological conditions characterized by heterogeneity in cerebrovascular haemodynamics; it can not only explain generalized results in terms of the physiological mechanisms involved but, by individualizing parameters, may also represent a valuable tool to help with difficult clinical decisions.

III. Effect of the Cushing Response on Systemic Arterial Pressure
During cerebral hypoxic conditions, the sympathetic system causes an increase in arterial pressure (Cushing response), creating a link between the cerebral and the systemic circulation. This work investigates the complex relationships among cerebrovascular dynamics, intracranial pressure, the Cushing response, and short-term systemic regulation during plateau waves, by means of an original mathematical model. The model incorporates the pulsating heart, the pulmonary circulation and the systemic circulation, with an accurate description of the cerebral circulation and of intracranial pressure dynamics (the same model as in point I). Various regulatory mechanisms are included: cerebral autoregulation, local blood flow control by oxygen (O2) and/or CO2 changes, and sympathetic and vagal regulation of cardiovascular parameters by several reflex mechanisms (chemoreceptors, lung-stretch receptors, baroreceptors). The Cushing response has been described by assuming a dramatic increase in sympathetic activity to the vessels during a fall in brain O2 delivery. With this assumption, the model is able to simulate the cardiovascular effects observed experimentally when intracranial pressure is artificially elevated and maintained at a constant level (arterial pressure increase and bradycardia). According to the model, these effects arise from the interaction between the Cushing response and the baroreflex response (secondary to the arterial pressure increase). Patients with severe head injury were then simulated by reducing intracranial compliance and cerebrospinal fluid reabsorption. With these changes, oscillations with plateau waves developed. In these conditions, model results indicate that the Cushing response may have both positive effects, reducing the duration of the plateau phase via an increase in cerebral perfusion pressure, and negative effects, increasing the intracranial pressure plateau level, with a risk of greater compression of the cerebral vessels. This model may be of value in helping clinicians find the balance between the clinical benefits of the Cushing response and its shortcomings.

IV. Comprehensive Cardiopulmonary Simulation Model for the Analysis of Hypercapnic Respiratory Failure
We developed a new comprehensive cardiopulmonary model that takes into account the mutual interactions between the cardiovascular and the respiratory systems along with their short-term regulatory mechanisms. The model includes the heart, the systemic and pulmonary circulations, lung mechanics, gas exchange and transport equations, and cardio-ventilatory control. Results show good agreement with published patient data for normoxic and hyperoxic hypercapnia simulations. In particular, the simulations predict a moderate increase in mean systemic arterial pressure and heart rate, with almost no change in cardiac output, paralleled by a marked increase in minute ventilation, tidal volume and respiratory rate.

The model can represent a valid tool for clinical practice and medical research, providing an alternative to purely experience-based clinical decisions. In conclusion, models are capable not only of summarizing current knowledge but also of identifying missing knowledge. In the former case they can serve as training aids for teaching the operation of complex systems, especially if the model can be used to demonstrate the outcome of experiments. In the latter case they generate experiments to be performed to gather the missing data.
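
As a purely illustrative aside (a textbook-style single-compartment sketch with made-up parameter values, far simpler than the comprehensive pulsatile models summarized above), intracranial pressure dynamics of the classical Marmarou type can be integrated numerically as follows:

```python
# Toy single-compartment ICP sketch (Marmarou-type), illustrative only:
#   dP/dt = E * P * ( I(t) - (P - P_ss) / R )
# with elastance coefficient E, CSF inflow I(t), outflow resistance R and
# sagittal sinus pressure P_ss.  All parameter values are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

E, R, P_ss = 0.11, 8.0, 6.0   # 1/mL, mmHg*min/mL, mmHg (assumed values)

def inflow(t):
    # baseline CSF formation plus a transient bolus between t = 10 and 20 min
    return 0.35 + (0.5 if 10.0 <= t <= 20.0 else 0.0)   # mL/min

def dPdt(t, P):
    return E * P[0] * (inflow(t) - (P[0] - P_ss) / R)

sol = solve_ivp(dPdt, (0.0, 60.0), [10.0], max_step=0.1)
print(f"ICP rises from {sol.y[0, 0]:.1f} to a peak of {sol.y[0].max():.1f} mmHg")
```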

Relevance:

100.00%

Publisher:

Abstract:

This research focuses on the complex relationship between theory and project which, in the architectural work of Oswald Mathias Ungers, is based on numerous essays and publications that, although never collected into an organic text, make up an articulated corpus that can be regarded as the foundation of a theory. More specifically, this thesis deals with the role of metaphor in Ungers' theory and with its subsequent practical application in his projects. The path leading from theoretical analysis to architectural project is, in Ungers' view, a slow and mediated one, in which theory is an instrument without which it would not be possible to lay the project's foundations. The metaphor is a figure of speech borrowed from disciplines such as philosophy, aesthetics and linguistics. Using a metaphor implies a transfer of meaning, as it is essentially based on the replacement of a real object with a figurative one. The research is articulated in three parts, each corresponding to a text by Ungers that is considered crucial for understanding the development of his architectural thinking; the three texts mark three decades of Ungers' work: the sixties, the seventies and the eighties. The first part of the research deals with the notion of Großform, expressed by Ungers in his 1966 publication Grossformen im Wohnungsbau, where he defines four criteria by which a piece of architecture identifies with a Großform. One of the hypotheses underlying this study is that there is a relationship between the notion of Großform and the figure of metaphor. The second part of the thesis analyzes the period between the end of the sixties and the seventies, during which Ungers lived in the USA and taught at Cornell University in Ithaca. The analysis focuses on the text Entwerfen und Denken in Vorstellungen, Metaphern und Analogien, written by Ungers in 1976 for the exhibition MAN transFORMS organized at the Cooper Hewitt Museum in New York. This text, in which Ungers creates a sort of vocabulary to explain the notions of metaphor, analogy, sign, symbol and allegory, can be regarded as the manifesto of his architectural theory, a theory strictly intertwined with the metaphor as a design instrument and best expressed when he introduces the eleven theses written with R. Koolhaas, P. Riemann, H. Kollhoff and A. Ovaska in Die Stadt in der Stadt. Berlin das grüne Stadtarchipel (1977). The third part analyzes the indissoluble tie between the use of metaphor and the choice of the theme on which a project is based and, starting from Ungers' 1982 publication Architecture as Theme, explains the relationship between idea/theme and image/metaphor. Playing with shapes requires metaphoric thinking, i.e. drawing references for new ideas from the world of shapes and not only from architecture. The metaphor as a tool to interpret reality becomes for Ungers an inquiry method that precedes a project and makes it possible to define the theme on which the project will be based. In Ungers' case, the architecture of ideas matches the idea of architecture; for Ungers the notions of idea and theme, image and metaphor cannot be separated from one another, and the text on the thematization of architecture is not a report of his projects but represents the need to put them in order and to highlight the theme on which each is based.

Relevance:

100.00%

Publisher:

Abstract:

In this dissertation a new method, based on the parallel/orthogonal-space method, was developed for the calculation of general massive two-loop three-point tensor integrals with planar and rotated reduced planar topologies. A tensor reduction for integrals that may carry an arbitrary tensor structure in Minkowski space was worked out and implemented. An algorithm for the semi-analytical calculation of the most difficult integrals remaining after the tensor reduction was developed and implemented (for the other basic integrals, well-known methods can be used). The implementation is complete for the UV-finite parts of the master integrals that still possess the above-mentioned topologies after tensor reduction. The numerical integrations have proven to be stable. For the remaining parts of the project, well-known methods can be used; to a large extent, only links to existing programs still have to be written. For the few remaining special topologies that are still to be implemented, (well-known) methods have to be implemented. The computer programs created within this project will also feed into the xloops project for more general processes; they were therefore developed and implemented for general processes wherever possible. The algorithm mentioned above was developed in particular for the evaluation of the fermionic NNLO corrections to the leptonic weak mixing angle and to similar processes. Within this dissertation, a large part of the work necessary for the fermionic NNLO corrections to the effective couplings of the Z decay (and hence to the weak mixing angle) was carried out.
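
For orientation, the class of integrals in question can be written in the generic form below (standard notation, not the specific parallel/orthogonal-space parametrization developed in the thesis):

```latex
T^{\mu_1\cdots\nu_1\cdots}(p_1,p_2)
  = \int \frac{\mathrm{d}^D k_1\,\mathrm{d}^D k_2}{(\mathrm{i}\pi^{D/2})^{2}}\;
    \frac{k_1^{\mu_1}\cdots\,k_2^{\nu_1}\cdots}
         {\prod_{j}\bigl(q_j^{2} - m_j^{2} + \mathrm{i}\varepsilon\bigr)},
\qquad
q_j \in \{k_1,\;k_2,\;k_1 \pm k_2,\;k_i + p_1,\;k_i + p_1 + p_2,\;\ldots\},
```

where the tensor reduction expresses such integrals in terms of a smaller set of scalar master integrals.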

Relevance:

100.00%

Publisher:

Abstract:

In this thesis the factorization method for detecting regions with sharply deviating material parameters is investigated. Through an abstract formulation we prove the range identity underlying the method for general real elliptic problems and deduce both known and new applications of the method. For the specific problem of locating magnetic or perfectly electrically conducting objects by low-frequency electromagnetic radiation, we show the unique solvability of the direct problem for sufficiently small frequencies and the convergence of the solutions to those of the elliptic equations of magnetostatics. By applying our general result we obtain the unique reconstructability of the sought objects from electromagnetic measurements and a numerical algorithm for locating the objects. Using a model problem, we investigate how inclusions described by parabolic differential equations can be reconstructed inside a domain described by elliptic differential equations. We prove the unique solvability of the underlying parabolic-elliptic direct problem and, through an extension of the factorization method, obtain the unique reconstructability of the inclusions as well as a numerical algorithm for the practical implementation of the method.
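
For orientation, in its classical impedance-tomography incarnation the range identity and the resulting characterization of an inclusion D read roughly as follows (the abstract elliptic setting treated in the thesis is more general):

```latex
\Lambda_D - \Lambda_0 \;=\; L\,F\,L^{*},
\qquad
z \in D
\;\Longleftrightarrow\;
\Phi_z\big|_{\partial\Omega} \in \mathcal{R}\!\left(\,|\Lambda_D - \Lambda_0|^{1/2}\right),
```

where Λ_D and Λ_0 denote the boundary measurement operators with and without the inclusion, and Φ_z is a dipole test function placed at the sampling point z.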

Relevance:

100.00%

Publisher:

Abstract:

We deal with five problems arising in the field of logistics: the Asymmetric TSP (ATSP), the TSP with Time Windows (TSPTW), the VRP with Time Windows (VRPTW), the Multi-Trip VRP (MTVRP), and the Two-Echelon Capacitated VRP (2E-CVRP). The ATSP requires finding a least-cost Hamiltonian tour in a digraph. We survey models and classical relaxations, and describe the most effective exact algorithms from the literature. A survey and analysis of the polynomial formulations are provided. The considered algorithms and formulations are compared experimentally on benchmark instances. The TSPTW requires finding, in a weighted digraph, a least-cost Hamiltonian tour visiting each vertex within a given time window. We propose a new exact method, based on new tour relaxations and dynamic programming. Computational results on benchmark instances show that the proposed algorithm outperforms the state-of-the-art exact methods. In the VRPTW, a fleet of identical capacitated vehicles located at a depot must be optimally routed to supply customers with known demands and time window constraints. Different column generation bounding procedures and an exact algorithm are developed. The new exact method closed four of the five open Solomon instances. The MTVRP is the problem of optimally routing capacitated vehicles located at a depot to supply customers without exceeding maximum driving time constraints. Two set-partitioning-like formulations of the problem are introduced. Lower bounds are derived and embedded into an exact solution method that can solve benchmark instances with up to 120 customers. The 2E-CVRP requires designing the optimal routing plan to deliver goods from a depot to customers by using intermediate depots. The objective is to minimize the sum of routing and handling costs. A new mathematical formulation is introduced. Valid lower bounds and an exact method are derived. Computational results on benchmark instances show that the new exact algorithm outperforms the state-of-the-art exact methods.
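
Several of the methods summarized above rest on set-partitioning master problems solved by column generation. As a generic point of reference (not necessarily the exact formulations introduced in the thesis), with R the set of feasible routes, c_r the cost of route r, a_ir = 1 if route r visits customer i, and K the fleet size, the VRPTW master problem is typically written as:

```latex
\min \sum_{r \in R} c_r\, x_r
\quad \text{s.t.} \quad
\sum_{r \in R} a_{ir}\, x_r = 1 \;\; \forall i \in N,
\qquad
\sum_{r \in R} x_r \le K,
\qquad
x_r \in \{0,1\} \;\; \forall r \in R,
```

where column generation prices out, via a resource-constrained shortest-path subproblem, routes with negative reduced cost, and the linear relaxation provides the lower bounds embedded in the exact methods.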

Relevance:

100.00%

Publisher:

Abstract:

The diagnosis of marginal zone B-cell non-Hodgkin lymphoma is based on morphological criteria and on substantial negativity for immunohistochemical markers expressed in other B-cell lymphoma subtypes. The aim of this work was therefore to search for a specific molecule associated with marginal zone lymphomas. Materials and Methods. 2,104 peripheral lymphomas of heterogeneous nosological entity were examined with a monoclonal antibody directed against the IRTA1 molecule, which recognizes the marginal zone in human lymphoid tissues. Results. IRTA1 expression was found in 93% of marginal zone lymphomas of extranodal origin and in 74% of primary nodal ones, suggesting that these lymphomas may originate from the perifollicular or monocytoid IRTA1+ cells found in reactive lymph nodes. Immunohistochemical evaluation by double staining (IRTA1/bcl6) also showed a phenotypic modulation of the neoplastic marginal zone cells as they colonize lymphoid follicles and during their circulation through germinal centres. Neoplastic marginal zone cells undergoing plasmacytic differentiation lose IRTA1 expression. Discussion. In conclusion, this evidence broadens our knowledge of the biology of marginal zone lymphomas and underlines that IRTA1 is the first positive diagnostic marker for these neoplasms.

Relevance:

100.00%

Publisher:

Abstract:

Network Theory is a prolific and lively field, especially where it meets Biology. New concepts from this theory find application in areas where extensive datasets are already available for analysis, without the need to invest money in collecting them. The only tools necessary to carry out such an analysis are easily accessible: a computing machine and a good algorithm. As these two tools progress, thanks to technological advancement and human effort, wider and wider datasets can be analysed. The aim of this paper is twofold. The first is to provide an overview of one of these concepts, which originates at the meeting point between Network Theory and Statistical Mechanics: the entropy of a network ensemble. This quantity has been described from different angles in the literature, and our approach tries to be a synthesis of the different points of view. The second part of the work is devoted to presenting a parallel algorithm that can evaluate this quantity over an extensive dataset. Finally, the algorithm will also be used to analyse high-throughput data coming from biology.
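
As a small, hedged illustration of the quantity discussed above, consider a canonical ensemble of undirected graphs in which each link (i, j) is drawn independently with probability p_ij; its Shannon entropy (one of several definitions found in the literature, and not necessarily the exact one adopted in the paper, nor its parallel implementation) can be evaluated as:

```python
# Sketch: Shannon entropy (in nats) of a canonical ensemble of undirected graphs
# with independent link probabilities p[i, j]:
#   S = - sum_{i<j} [ p_ij ln p_ij + (1 - p_ij) ln(1 - p_ij) ]
import numpy as np

def ensemble_entropy(p: np.ndarray) -> float:
    iu = np.triu_indices_from(p, k=1)        # count each undirected pair once
    q = np.clip(p[iu], 1e-12, 1 - 1e-12)     # guard against log(0)
    return float(-np.sum(q * np.log(q) + (1 - q) * np.log(1 - q)))

rng = np.random.default_rng(0)
probs = rng.uniform(0.0, 1.0, size=(200, 200))
probs = (probs + probs.T) / 2                # symmetric link probabilities
print(f"ensemble entropy: {ensemble_entropy(probs):.1f} nats")
```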

Relevance:

100.00%

Publisher:

Abstract:

This thesis sets out to provide a unified analysis of the theme of feral children, in order to highlight the value and the implications of what can emerge today from studying feral children from a pedagogical-educational perspective. The expression "feral child" is traditionally used to refer to children who grew up for a long period in the company of animals or in isolated places. Over time other meanings have appeared, and the use of the expression has taken on different connotations depending on the contexts of reference and on possible connections with the theme of wildness in general. This theme lies at the crossroads of several research areas; crossroads are like border territories, places where it is easy to get lost, but also places of encounter, comparison and exchange, places that lead to new knowledge. Tackling, as in this case, a research topic rich in aspects that can be analysed from several points of view implies the need to draw on several perspectives of inquiry and to engage with different fields of study. This must not, however, turn into an indistinct and merely juxtaposed inclusion of the various readings of the phenomenon and of the elements connected with the research. The present work therefore seeks to adopt a holistic orientation to the theme of feral children through a pedagogical-educational lens. The general objective of the research is articulated in three sub-objectives: to carry out an analysis of the phenomenon of feral children that brings to light the theme's contributions to the evolution of the history of pedagogy; to bring out and highlight the connections of the theme with educational issues that are currently relevant and are the object of contemporary pedagogical debate; and to analyse the points of contact between research on feral children and the evidence emerging from neuroscientific studies.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation mimics the Turkish college admission procedure. It started with the purpose of reducing the inefficiencies of the Turkish market. For this purpose, we propose a mechanism under a new market structure which we call semi-centralization. In chapter 1, we give a brief summary of Matching Theory. We present the first examples in matching history together with the most general papers and mechanisms. In chapter 2, we propose our mechanism. In its real-life application, that is in Turkish university placements, the mechanism reduces the inefficiencies of the current system. The success of the mechanism depends on the preference profile. It is easy to show that under complete information the mechanism implements the full set of stable matchings for a given profile. In chapter 3, we refine our basic mechanism. The modification of the mechanism has a crucial effect on the results. The new mechanism is what we call a middle mechanism. On one of the subdomains this mechanism coincides with the original basic mechanism, while on the other partition it gives the same results as Gale and Shapley's algorithm. In chapter 4, we apply our basic mechanism to the well-known Roommate Problem. Since the roommate problem follows a one-sided game pattern, we first propose an auxiliary function to convert the game into a semi-centralized two-sided game, because our basic mechanism is designed for that framework. We show that this process is successful in finding a stable matching whenever one exists. We also show that our mechanism tells us easily and simply whether a profile lacks stability, by using purified orderings. Finally, we show a method to find all stable matchings when multiple stable matchings exist: the method is simply to run the mechanism for all of the top agents in the social preference.
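
For reference, the student-proposing deferred-acceptance algorithm of Gale and Shapley, with which the middle mechanism coincides on one subdomain, can be sketched as follows (textbook version, not the mechanism proposed in the thesis; names and preference data are made up):

```python
# Textbook student-proposing deferred acceptance (Gale-Shapley), for reference.
def deferred_acceptance(student_prefs, school_prefs, capacities):
    """student_prefs / school_prefs: dict mapping agent -> ordered list of acceptable partners."""
    rank = {s: {st: i for i, st in enumerate(pref)} for s, pref in school_prefs.items()}
    next_choice = {st: 0 for st in student_prefs}
    held = {s: [] for s in school_prefs}                 # tentatively admitted students
    free = list(student_prefs)
    while free:
        st = free.pop()
        if next_choice[st] >= len(student_prefs[st]):
            continue                                     # list exhausted: stays unmatched
        school = student_prefs[st][next_choice[st]]
        next_choice[st] += 1
        if st not in rank[school]:
            free.append(st)                              # unacceptable to this school
            continue
        held[school].append(st)
        held[school].sort(key=rank[school].get)          # best students first
        if len(held[school]) > capacities[school]:
            free.append(held[school].pop())              # reject the worst tentative student
    return {s: tuple(v) for s, v in held.items()}

match = deferred_acceptance(
    {"a": ["X", "Y"], "b": ["X"], "c": ["Y", "X"]},
    {"X": ["b", "a", "c"], "Y": ["c", "a"]},
    {"X": 1, "Y": 1},
)
print(match)   # {'X': ('b',), 'Y': ('c',)}; student 'a' remains unmatched
```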

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we investigate several phenomenologically important properties of top-quark pair production at hadron colliders. We calculate double differential cross sections in two different kinematical setups, pair invariant-mass (PIM) and single-particle inclusive (1PI) kinematics. In pair invariant-mass kinematics we present results for the double differential cross section with respect to the invariant mass of the top-quark pair and the top-quark scattering angle. Working in the threshold region, where the pair invariant mass M is close to the partonic center-of-mass energy √ŝ, we are able to factorize the partonic cross section into contributions from different energy regions. We use renormalization-group (RG) methods to resum large threshold logarithms to next-to-next-to-leading-logarithmic (NNLL) accuracy. On a technical level this is done using effective field theories, such as heavy-quark effective theory (HQET) and soft-collinear effective theory (SCET). The same techniques are applied in 1PI kinematics, leading to a calculation of the double differential cross section with respect to the transverse momentum pT and the rapidity of the top quark. We restrict the phase space such that only soft emission of gluons is possible, and perform an NNLL resummation of threshold logarithms. The analytical expressions obtained enable us to predict several observables precisely, and a substantial part of this thesis is devoted to their detailed phenomenological analysis. Matching our results in the threshold regions to the exact ones at next-to-leading order (NLO) in fixed-order perturbation theory allows us to make predictions at NLO+NNLL order in RG-improved perturbation theory, and at approximate next-to-next-to-leading order (NNLO) in fixed-order perturbation theory. We give numerical results for the invariant-mass distribution of the top-quark pair, and for the top-quark transverse-momentum and rapidity spectra. We predict the total cross section separately for both kinematics. Using these results, we analyze subleading contributions to the total cross section in 1PI and PIM kinematics originating from power corrections to the leading terms in the threshold expansions, and compare them to previous approaches. We then combine our PIM and 1PI results for the total cross section, eliminating in this way the uncertainties due to these corrections. The combined predictions for the total cross section are presented as a function of the top-quark mass in the pole, the minimal-subtraction (MS), and the 1S mass schemes. In addition, we calculate the forward-backward (FB) asymmetry at the Tevatron in the laboratory and ttbar rest frames as a function of the rapidity and the invariant mass of the top-quark pair at NLO+NNLL. We also give binned results for the asymmetry as a function of the invariant mass and of the rapidity difference of the ttbar pair, and compare them to recent measurements. As a last application we calculate the charge asymmetry at the LHC as a function of a lower rapidity cut-off for the top and anti-top quarks.
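
Schematically, and glossing over conventions and convolutions that vary between papers, the PIM threshold factorization underlying this kind of NNLL resummation has the form:

```latex
\frac{\mathrm{d}^2\hat{\sigma}}{\mathrm{d}M\,\mathrm{d}\cos\theta}
  \;\sim\;
  \mathrm{Tr}\!\Big[\,\mathbf{H}(M,\cos\theta,\mu)\;
                     \mathbf{S}\big(\sqrt{\hat{s}}-M,\,\mu\big)\Big],
```

where the hard function H and the soft function S are matrices in color space, and solving their renormalization-group equations resums the large logarithms of the threshold variable to the stated logarithmic accuracy.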

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates a new approach to document analysis based on the idea of structural patterns in XML vocabularies. My work is founded on the belief that authors naturally converge to a reasonable use of markup languages and that extreme, yet valid, instances are rare and limited. Actual documents, therefore, may be used to derive classes of elements (patterns) persisting across documents and distilling the conceptualization of the documents and their components, and may provide ground for automatic tools and services that rely on no background information (such as schemas) at all. The central part of my work consists in introducing, from the ground up, a formal theory of eight structural patterns (with three sub-patterns) that are able to express the logical organization of any XML document, and in verifying their identifiability in a number of different vocabularies. This model is characterized by and validated against three main dimensions: terseness (i.e. the ability to represent the structure of a document with a small number of objects and composition rules), coverage (i.e. the ability to capture any possible situation in any document) and expressiveness (i.e. the ability to make explicit the semantics of structures, relations and dependencies). An algorithm for the automatic recognition of structural patterns is then presented, together with an evaluation of the results of a test performed on a set of more than 1100 documents from eight very different vocabularies. This language-independent analysis confirms the ability of patterns to capture and summarize the guidelines used by the authors in their everyday practice. Finally, I present some systems that work directly on the pattern-based representation of documents. The ability of these tools to cover very different situations and contexts confirms the effectiveness of the model.
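
A toy, schema-free classification of elements (much coarser than the eight-pattern theory developed in the thesis, and intended only to illustrate instance-driven structural analysis) might look like this:

```python
# Toy structural classification of XML elements: a coarse stand-in for the
# eight-pattern model, based only on whether an element carries text and children.
import xml.etree.ElementTree as ET
from collections import Counter

def classify(elem: ET.Element) -> str:
    has_text = bool((elem.text or "").strip()) or any((c.tail or "").strip() for c in elem)
    has_children = len(elem) > 0
    if has_children and has_text:
        return "mixed"        # text interleaved with child elements
    if has_children:
        return "container"    # child elements only
    if has_text:
        return "atom"         # text content only
    return "marker"           # empty element

doc = ET.fromstring("<p>A <em>short</em> example<br/> with <code>code</code>.</p>")
counts = Counter(classify(e) for e in doc.iter())
print(counts)   # Counter({'atom': 2, 'mixed': 1, 'marker': 1})
```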

Relevance:

100.00%

Publisher:

Abstract:

In this work we study a model for breast image reconstruction in Digital Tomosynthesis, a non-invasive and non-destructive method for the three-dimensional visualization of the inner structures of an object, in which the data acquisition consists of measuring a limited number of low-dose two-dimensional projections of the object by moving a detector and an X-ray tube around it within a limited angular range. Reconstructing 3D images from the projections provided by Digital Tomosynthesis is an ill-posed inverse problem, which leads to a minimization problem with an objective function containing a data-fitting term and a regularization term. The contribution of this thesis is to use techniques from compressed sensing, in particular replacing the standard least-squares data-fitting problem with the minimization of the 1-norm of the residuals, and using the Total Variation (TV) as regularization term. We tested two different algorithms: a new alternating minimization algorithm (ADM) and a version of the more standard scaled projected gradient algorithm (SGP) adapted to the 1-norm. We perform experiments and analyse the performance of the two methods, comparing relative errors, numbers of iterations, run times and the quality of the reconstructed images. In conclusion, we observed that the 1-norm and the Total Variation are valid tools in the formulation of the minimization problem for image reconstruction in Digital Tomosynthesis, and that the new ADM algorithm reached a relative error comparable to that of a version of the classic SGP algorithm while proving better in speed and in the early appearance of the structures representing the masses.
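
In formulas, and using generic notation not taken from the thesis (A the projection operator, b the measured projection data, λ > 0 the regularization weight, with TV written here in its isotropic discrete form; the thesis may use a different variant), the reconstruction problem described above reads:

```latex
\min_{x}\; \|A x - b\|_{1} \;+\; \lambda\,\mathrm{TV}(x),
\qquad
\mathrm{TV}(x) \;=\; \sum_{i} \big\|(\nabla x)_i\big\|_{2},
```

i.e. the standard least-squares data-fitting term ‖Ax − b‖₂² is replaced by the 1-norm of the residuals, which is then minimized by ADM or by the 1-norm variant of SGP.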

Relevance:

100.00%

Publisher:

Abstract:

Abnormal activation of cellular DNA repair pathways by deregulated signaling of receptor tyrosine kinase systems has broad implications for both cancer biology and treatment. Recent studies suggest a potential link between DNA repair and aberrant activation of the hepatocyte growth factor receptor MET (mesenchymal-epithelial transition factor), an oncogene that is overexpressed in numerous types of human tumors and considered a prime target in clinical oncology. Using the homologous recombination (HR) direct-repeat green fluorescent protein (DR-GFP) reporter system, we show that MET inhibition by the small molecule PHA665752, in tumor cells with deregulated MET activity, significantly impairs HR in a dose-dependent manner. Using cells that express MET-mutated variants that respond differentially to PHA665752, we confirm that the observed HR inhibition is indeed MET-dependent. Furthermore, our data also suggest that the decline in HR-dependent DNA repair activity is not a secondary effect of cell-cycle alterations caused by PHA665752. Mechanistically, we show that MET inhibition affects the formation of the RAD51-BRCA2 complex, which is crucial for error-free HR repair of double-strand DNA lesions, presumably via downregulation and impaired translocation of RAD51 into the nucleus. Taken together, these findings further support the role of MET in the cellular DNA damage response and highlight the potential future benefit of MET inhibitors for sensitizing tumor cells to DNA-damaging agents.