Abstract:
The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues and organs. Precise delineation of treatment and avoidance volumes is the key to precision radiation therapy. In recent years, considerable clinical and research effort has been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft-tissue contrast and functional imaging capability. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of the tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI's clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI implementation and the need for novel DCE-MRI data analysis methods that extract richer functional heterogeneity information.
This study aims to improve current DCE-MRI techniques and to develop new DCE-MRI analysis methods tailored to radiotherapy assessment. The study is therefore divided into two parts. The first part focuses on temporal resolution, one of the key technical factors of DCE-MRI, and proposes several improvements; the second part explores the potential value of image heterogeneity analysis and the combination of multiple PK models for therapeutic response assessment, and develops several novel DCE-MRI data analysis methods.
I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel MR image iterative reconstruction algorithm was investigated for DCE-MRI reconstruction. This algorithm builds on the recently developed compressed sensing (CS) theory: by using a limited k-space acquisition with a shorter imaging time, images can be reconstructed iteratively under the regularization of a newly proposed total generalized variation (TGV) penalty term. In an IRB-approved retrospective study of brain radiosurgery patient DCE-MRI scans, the clinically acquired image data were selected as reference data, and accelerated k-space acquisitions were simulated by undersampling the full k-space of the reference images with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution to sample each slice of the full k-space; and 2) a series of Cartesian random sampling grids, with spatiotemporal constraints linking adjacent frames, to sample the dynamic k-space series at a given slice location. Two sets of PK parameter maps were generated, one from the undersampled data and one from the fully sampled data. Multiple quantitative measurements and statistical analyses were performed to evaluate the accuracy of the PK maps generated from the undersampled data against those generated from the fully sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully sampled data sets. DCE-MRI acceleration using the investigated image reconstruction method thus appears feasible and promising.
Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for PK parameters with better calculation accuracy and efficiency. This method is based on a derivative-based reformulation of the commonly used Tofts PK model, which is conventionally presented as an integral expression. The method also applies an advanced Kolmogorov-Zurbenko (KZ) filter to remove potential noise effects in the data, and solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and data noise levels. Results showed that at both high temporal resolutions (<1 s) and a clinically feasible temporal resolution (~5 s), the new method calculated PK parameters more accurately than current calculation methods at clinically relevant noise levels; at high temporal resolutions, its calculation efficiency was superior to current methods by roughly two orders of magnitude (10²). In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that the new method enables accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
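The way PK fitting reduces to a linear matrix problem can be sketched as follows. This is not the dissertation's derivative/KZ-filter method; it is the well-known linearized integral formulation of the Tofts model, with a made-up arterial input function and parameter values for illustration:

```python
import numpy as np

def cumtrapz0(y, dt):
    # cumulative trapezoidal integral, starting at 0
    return np.concatenate(([0.0], np.cumsum((y[1:] + y[:-1]) / 2) * dt))

def tofts_ct(t, cp, ktrans, kep):
    # Tofts model: Ct(t) = Ktrans * integral_0^t Cp(u) exp(-kep (t - u)) du
    dt = t[1] - t[0]
    return np.array([ktrans * np.trapz(cp[:i + 1] * np.exp(-kep * (t[i] - t[:i + 1])), dx=dt)
                     for i in range(len(t))])

def fit_tofts_linear(t, cp, ct):
    # integrating the Tofts ODE gives Ct = Ktrans*int(Cp) - kep*int(Ct),
    # which is linear in (Ktrans, kep) and solvable by least squares
    dt = t[1] - t[0]
    A = np.column_stack([cumtrapz0(cp, dt), -cumtrapz0(ct, dt)])
    ktrans, kep = np.linalg.lstsq(A, ct, rcond=None)[0]
    return ktrans, kep
```

At a 1 s temporal resolution the linear system is well conditioned and the fit is a single least-squares solve, which is what makes such formulations fast compared with iterative nonlinear fitting.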
II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part pursues methodological development along two approaches. The first is to develop model-free analysis methods for evaluating DCE-MRI functional heterogeneity, inspired by the rationale that radiotherapy-induced functional change can be heterogeneous across the treatment area. The first effort was a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, randomly assigned treatment and control groups received multiple treatment fractions, with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan two weeks after the start of treatment, the investigated Rényi dimensions of the classic PK rate constant map demonstrated significant differences between the treatment and control groups; when the Rényi dimensions were used for treatment/control group classification, the achieved accuracy was higher than that obtained using conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed to address the lack of temporal information and the poor computational efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small-animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM performed better overall than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second method developed is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method quantitatively measures the dynamics of tumor heterogeneity on DCE images during contrast agent uptake.
In the small-animal experiment mentioned above, selected parameters from the dynamic FSD analysis showed significant differences between the treatment and control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. Treatment/control group classification after the first treatment fraction was also improved when using dynamic FSD parameters rather than conventional PK statistics. These results suggest that this novel method is promising for capturing early therapeutic response.
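The fractal-dimension idea behind these heterogeneity analyses can be sketched with a basic box-counting estimator, which corresponds to the Rényi dimension at q = 0. This is a generic textbook illustration, not the dissertation's Rényi-spectrum or FSD implementation:

```python
import numpy as np

def box_counting_dimension(img, scales=(1, 2, 4, 8, 16)):
    # img: 2-D boolean map; count occupied boxes at each scale and fit the
    # slope of log N(s) versus log(1/s), i.e. the capacity dimension
    n = img.shape[0]
    counts = []
    for s in scales:
        m = n - n % s
        view = img[:m, :m].reshape(m // s, s, -1, s)
        occupied = view.any(axis=(1, 3))
        counts.append(occupied.sum())
    slope = np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)[0]
    return slope
```

Sanity checks on simple shapes recover the expected values: a filled plane has dimension 2 and a straight line has dimension 1; heterogeneous parameter maps fall in between.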
The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model and its variants are widely adopted for DCE-MRI analysis as the gold-standard approach for therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. In spite of its richer biological assumptions, its application in therapeutic response assessment has been limited. It is therefore intriguing to combine information from the SS model and the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small-animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using comparisons of regional mean PK parameter values. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model was shown to be superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, a novel biomarker was designed to integrate the PK rate constants from the two models. When evaluated within the biological subvolume, this biomarker reflected significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.
In summary, this study addressed two problems in the application of DCE-MRI to radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit the future routine clinical application of DCE-MRI in radiotherapy assessment.
Abstract:
Development of reliable methods for optimised energy storage and generation is one of the most imminent challenges in modern power systems. In this paper, an adaptive approach to the load-leveling problem is proposed, using novel dynamic models based on Volterra integral equations of the first kind with piecewise continuous kernels. These integral equations efficiently solve the inverse problem, taking into account both the time-dependent efficiencies and the generation/storage availability of each energy storage technology. In this analysis, a direct numerical method is employed to find the least-cost dispatch of the available storages. The proposed collocation-type numerical method has second-order accuracy and enjoys self-regularization properties associated with the confidence levels of system demand. This adaptive approach is suitable for energy storage optimisation in real time. The efficiency of the proposed methodology is demonstrated on the Single Electricity Market of the Republic of Ireland and Northern Ireland.
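To illustrate the flavor of a direct collocation scheme for a first-kind Volterra equation, here is a minimal midpoint-rule sketch. It is a generic textbook discretization, not the paper's piecewise-kernel storage model, and the kernels and right-hand sides used in the test are toy choices:

```python
import numpy as np

def solve_volterra_1st(kernel, f, t_end, n):
    # midpoint collocation for integral_0^t K(t, s) x(s) ds = f(t);
    # the discretized system is lower-triangular, so it is solved by
    # forward substitution; returns x evaluated at the midpoints
    h = t_end / n
    t = h * np.arange(1, n + 1)      # collocation points
    s = t - h / 2                    # quadrature midpoints
    x = np.zeros(n)
    for i in range(n):
        acc = sum(kernel(t[i], s[j]) * x[j] * h for j in range(i))
        x[i] = (f(t[i]) - acc) / (kernel(t[i], s[i]) * h)
    return s, x
```

For smooth kernels with K(t, t) bounded away from zero, this midpoint scheme is known to be second-order accurate, consistent with the accuracy claimed for the collocation method in the paper.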
Abstract:
Inverse simulations of musculoskeletal models compute internal forces, such as muscle and joint reaction forces, which are hard to measure, using the more easily measured motion and external forces as input data. Because of the difficulty of measuring muscle forces and joint reactions, such simulations are hard to validate. One way of reducing simulation errors is to ensure that the mathematical problem is well-posed. This paper presents a study of regularity aspects of an inverse simulation method, often called forward dynamics or dynamical optimization, that takes into account both measurement errors and muscle dynamics. The simulation method is explained in detail. Regularity is examined for a test problem around the optimum using the approximated quadratic problem. The results show improved rank when a regularization term that handles the mechanical over-determinacy is included in the objective. Using the three-element Hill muscle model, the chosen regularization term is the norm of the activation. To make the problem full-rank, only the excitation bounds should be included in the constraints. However, this results in small negative activation values, which indicate that muscles are pushing rather than pulling. Despite this unrealistic behavior, the error may be small enough to be acceptable for specific applications. These results are a starting point for achieving better inverse musculoskeletal simulation results from a numerical point of view.
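The role of the activation-norm regularizer can be seen in a small static sketch: with a squared-activation penalty, the otherwise rank-deficient muscle-force distribution problem becomes a full-rank linear system, and the near-least-norm solution can indeed assign small negative activations to antagonist muscles, echoing the behavior the paper reports. The moment arms, maximal forces, and torque below are invented for illustration:

```python
import numpy as np

def muscle_activations(R, fmax, tau, lam=1e-3):
    # minimize ||R diag(fmax) a - tau||^2 + lam * ||a||^2  over activations a;
    # the lam*I term makes the normal equations full-rank despite having more
    # muscles than joint torque equations (mechanical over-determinacy)
    A = R * fmax                                   # effective torque per unit activation
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ tau)
```

With three muscles spanning one joint, the regularized solution reproduces the requested torque almost exactly while the muscle with a negative moment arm receives a small negative activation, i.e. it "pushes".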
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
This booklet was prepared from the actions of the university extension project entitled "Housing and Environment: building dialogue on the urbanization of the Ilha settlement". The study was conducted in a community located at the southern end of the city of Almirante Tamandaré, between the Barigui and Tanguá rivers. The project was carried out by a group of professors and students of the Federal University of Technology - Paraná (UTFPR), Campus Curitiba. The main objective was to investigate ways of intervening in the housing and urbanization of the Ilha settlement, aiming at the regularization of its properties. However, throughout the project, the group found that regularization of this settlement was not possible, in view of the risk of flooding on site. Therefore, this booklet provides information about the area and the rivers in its surroundings and on the positive aspects of living there, tells the story of some of the residents' struggles for better living conditions, and offers suggestions of funding sources to facilitate a possible relocation of the existing families.
Abstract:
This book's guiding thread is a description of the activities developed during the university extension project entitled "Housing and Environment: building dialogue over the urbanization of the Ilha settlement", located in the Metropolitan Region of Curitiba. The project was coordinated by professors from the Federal University of Technology - Paraná (UTFPR). The initial objectives of the extension project were to investigate ways of intervening in the poor housing and urbanization conditions of the Ilha settlement, aiming at its land regularization. The book tells the story of the extension project, showing how the initial goals changed over time. In addition, it describes the frustrations and the learning process along the way, from the viewpoint of the UTFPR professors and students who actively participated in the project. The book also reports the feelings that the villagers attribute to their place of residence: the joys, stumbles, and lessons of using a participatory methodology grounded in Paulo Freire's ideas on popular education. Moreover, the book presents the confrontation between the technical and the popular visions of the regularization of the area.
Abstract:
This study presents research on affordable housing and its effects on the spatial reconfiguration of Natal/RN, aiming to identify the specificities of urban land informality. It seeks to understand how the informal housing market provides housing for the population of informal settlements, through the buying-and-selling and rental markets for irregular/illegal residential properties. This understanding is developed through the neighborhood of Mãe Luiza, a Special Area of Social Interest (SASI) located between neighborhoods with high-purchasing-power populations and inserted into the seaside tourist axis of the city. The characterization of the informal housing market in Mãe Luiza, from the perspective of buyers, sellers, and renters, will help in understanding how these informal transactions operate in SASIs, and will support public policy: the development and implementation of housing programs and land regularization for the low-income population, adequate to the dynamics and reality of housing in informal areas.
Abstract:
With the dramatic growth of text information, there is an increasing need for powerful text mining systems that can automatically discover useful knowledge from text. Text is generally associated with all kinds of contextual information. Those contexts can be explicit, such as the time and the location where a blog article is written, and the author(s) of a biomedical publication, or implicit, such as the positive or negative sentiment that an author had when she wrote a product review; there may also be complex contexts, such as the social network of the authors. Many applications require analysis of topic patterns over different contexts. For instance, analysis of search logs in the context of the user can reveal how we can improve the quality of a search engine by optimizing the search results according to particular users; analysis of customer reviews in the context of positive and negative sentiments can help the user summarize public opinions about a product; analysis of blogs or scientific publications in the context of a social network can facilitate discovery of more meaningful topical communities. Since context information significantly affects the choices of topics and language made by authors, it is in general very important to incorporate it into the analysis and mining of text data. Modeling the context in text and discovering contextual patterns of language units and topics from text, a general task that we refer to as Contextual Text Mining, has widespread applications in text mining. In this thesis, we provide a novel and systematic study of contextual text mining, a new paradigm of text mining that treats context information as the ``first-class citizen.'' We formally define the problem of contextual text mining and its basic tasks, and propose a general framework for contextual text mining based on generative modeling of text.
This conceptual framework provides general guidance on text mining problems with context information and can be instantiated into many real tasks, including the general problem of contextual topic analysis. We formally present a functional framework for contextual topic analysis, with a general contextual topic model and its various versions, which can effectively solve text mining problems in many real-world applications. We further introduce general components of contextual topic analysis: adding priors to contextual topic models to incorporate prior knowledge, regularizing contextual topic models with the dependency structure of the context, and postprocessing contextual patterns to extract refined patterns. These refinements of the general contextual topic model naturally lead to a variety of probabilistic models that incorporate different types of context and various assumptions and constraints. These special versions of the contextual topic model prove effective in a variety of real applications involving topics and explicit, implicit, and complex contexts. We then introduce a postprocessing procedure for contextual patterns that generates meaningful labels for multinomial context models, providing a general way to interpret text mining results for real users. By applying contextual text mining in the ``context'' of other text information management tasks, including ad hoc text retrieval and web search, we further demonstrate the effectiveness of contextual text mining techniques quantitatively on large-scale datasets. The framework of contextual text mining not only unifies many explorations of text analysis with context information, but also opens up many new possibilities for future research directions in text mining.
The mismatch of an experience: an evaluation of the Habitar Brasil Program in the África Community, Natal/RN
Abstract:
This research evaluates the Brazilian Federal Government's favela urbanization program Habitar Brasil (1993) as carried out in the África slum, Redinha neighbourhood, in Natal/RN. The study, conducted from 2005 to 2006, seeks to identify the effects of the actions proposed by the Program in 1993-1994 on the current urbanistic configuration of the África community. It analyzes the effectiveness of the process in achieving the stated objectives for housing, community facilities, infrastructure, and land regularization. The evaluation process takes as references the works of Adauto Cardoso (2004), Blaine Worthen (2004), Ronaldo Garcia (2001) and Rosângela Paz (2006). On housing policy, with a focus on urban law and the right to housing, the reflections of Raquel Rolnik, Nabil Bonduki, Ermínia Maricato, Saule Júnior, Betânia de Moraes Alfonsin and Edésio Fernandes are the main references. To gauge the fulfillment of the objectives proposed by Habitar Brasil in 1993, the study draws on documentary data of the time and on information obtained in interviews with technicians who participated in the program, providing consistent references on what was proposed, what was executed, and the process of Habitar Brasil's intervention in the África community. The 2005-2006 analysis of the area is based on an urbanistic survey of the current situation along the Program's four lines of action: housing, infrastructure, community facilities, and land regularization, yielding a current urbanistic evaluation of África in light of the intervention carried out in 1993 and 1994. The study situates the Brazilian housing policy context in which the Habitar Brasil Program was launched, explaining the Program's main principles.
At the local level, it emphasizes the political-administrative factors that contributed to making Natal/RN a pioneer in capturing Habitar Brasil (1993) resources. Regarding Habitar Brasil in África, the work discusses and presents the intervention diagnosis and proposal developed by the Program in 1993, evidencing the local problems of the time. It then makes a current reading of the area, identifying, in 2006, elements representative of Habitar Brasil (1993-1994) for the África community. It identifies significant advances in the constitution of the institutional apparatus of the Social Interest Housing planning system for the city of Natal, and points out fragilities in the implementation of the urban infrastructure actions and, above all, in the achievement of the land regularization objectives.
Abstract:
This research falls within the subject of social-interest housing and its relation to sanitation infrastructure (sewerage, water, drainage, and garbage collection). Taking as its universe of study the Forty stream, located in the city of Manaus, capital of Amazonas, it approaches the questions that arise between housing needs and the specificities of the natural environment, whose characteristics impose limits on the implementation of adequate housing. The objective is to analyze the possibilities and limits of urbanistic regularization of the stilt houses (palafitas) along the streams of Manaus, in view of the factors of habitability and environmental protection expressed by the sanitation system: sanitary sewerage, water supply, urban drainage, and garbage collection. The work initially approaches conceptual aspects of social housing in the country and its relation to habitability factors, also focusing on housing and the processes of urban informality in the city of Manaus. It deals with the process of constitution of the palafitas in the space of the city and their relation to housing policies, presenting an analysis of the implantation of the palafitas with respect to sanitation infrastructure conditions (sewerage, water, drainage, and garbage). In conclusion, it identifies the possibilities and limits of urbanistic regularization of the palafitas implanted along the Forty stream, taking the sanitation infrastructure systems into consideration.
Abstract:
This thesis deals with tensor completion for the solution of multidimensional inverse problems. We study the problem of reconstructing an approximately low rank tensor from a small number of noisy linear measurements. New recovery guarantees, numerical algorithms, non-uniform sampling strategies, and parameter selection algorithms are developed. We derive a fixed point continuation algorithm for tensor completion and prove its convergence. A restricted isometry property (RIP) based tensor recovery guarantee is proved. Probabilistic recovery guarantees are obtained for sub-Gaussian measurement operators and for measurements obtained by non-uniform sampling from a Parseval tight frame. We show how tensor completion can be used to solve multidimensional inverse problems arising in NMR relaxometry. Algorithms are developed for regularization parameter selection, including accelerated k-fold cross-validation and generalized cross-validation. These methods are validated on experimental and simulated data. We also derive condition number estimates for nonnegative least squares problems. Tensor recovery promises to significantly accelerate N-dimensional NMR relaxometry and related experiments, enabling previously impractical experiments. Our methods could also be applied to other inverse problems arising in machine learning, image processing, signal processing, computer vision, and other fields.
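As a toy illustration of the recovery problem, here is a matrix (rather than tensor) completion sketch using alternating projections with a known target rank; the thesis's fixed-point continuation algorithm, RIP analysis, and cross-validated parameter selection are more general, and all sizes and sampling rates below are invented:

```python
import numpy as np

def lowrank_complete(obs, mask, rank, n_iter=300):
    # alternating-projection ("hard impute") completion: project onto the set
    # of rank-r matrices via a truncated SVD, then restore observed entries;
    # a simplified matrix analogue of fixed-point tensor completion
    x = np.where(mask, obs, 0.0)
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(x, full_matrices=False)
        x = (u[:, :rank] * s[:rank]) @ vt[:rank]
        x[mask] = obs[mask]
    return x
```

With a rank-2 matrix and roughly half the entries observed, the number of samples far exceeds the degrees of freedom of the low-rank model, so the iteration recovers the missing entries to high accuracy.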
Abstract:
We present a detailed analysis of the application of a multi-scale Hierarchical Reconstruction method for solving a family of ill-posed linear inverse problems. When the observations of the unknown quantity of interest and the observation operators are known, these inverse problems are concerned with the recovery of the unknown from its observations. Although the observation operators we consider are linear, they are inevitably ill-posed in various ways. We recall in this context the classical Tikhonov regularization method with a stabilizing function that targets the specific ill-posedness of the observation operators and preserves desired features of the unknown. Having studied the mechanism of Tikhonov regularization, we propose a multi-scale generalization of the Tikhonov regularization method, the so-called Hierarchical Reconstruction (HR) method. The HR method can be traced back to the Hierarchical Decomposition method in image processing. It successively extracts information from the previous hierarchical residual into the current hierarchical term at a finer hierarchical scale. As the sum of all hierarchical terms, the hierarchical sum produced by the HR method provides a reasonable approximate solution to the unknown when the observation matrix satisfies certain conditions with specific stabilizing functions. Compared to the Tikhonov regularization method on the same inverse problems, the HR method is shown to decrease the total number of iterations, reduce the approximation error, and offer self-control over the approximation distance between the hierarchical sum and the unknown, thanks to its use of a ladder of finitely many hierarchical scales. We report numerical experiments supporting our claims about these advantages of the HR method over the Tikhonov regularization method.
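The hierarchical idea can be condensed into a few lines: solve a Tikhonov problem, then re-solve on the residual with a halved regularization parameter, and sum the terms. This sketch uses the simplest stabilizing function, the identity; the λ schedule, problem sizes, and test data are illustrative:

```python
import numpy as np

def tikhonov(A, b, lam):
    # classical Tikhonov solution: argmin ||A x - b||^2 + lam * ||x||^2
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def hierarchical_reconstruction(A, b, lam0=1.0, levels=8):
    # multi-scale Tikhonov: solve on the residual with lambda halved at each
    # level, and return the hierarchical sum x = sum_j x_j
    x = np.zeros(A.shape[1])
    r = b.copy()
    for j in range(levels):
        xj = tikhonov(A, r, lam0 / 2**j)
        x += xj
        r -= A @ xj
    return x
```

On a consistent overdetermined system, the hierarchical sum approaches the least-squares solution as the scale ladder descends, so its error is smaller than that of a single strongly regularized Tikhonov solve.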
Abstract:
Visual recognition is a fundamental research topic in computer vision. This dissertation explores datasets, features, learning, and models used for visual recognition. In order to train visual models and evaluate different recognition algorithms, this dissertation develops an approach to collecting object image datasets from web pages using an analysis of the text around each image and of the image appearance. The method exploits established online knowledge resources (Wikipedia pages for text; the Flickr and Caltech data sets for images), which provide rich text and object appearance information. Results are described on two datasets. The first is Berg's collection of 10 animal categories, on which we significantly outperform previous approaches; on an additional set of 5 categories, experimental results likewise show the effectiveness of the method. Images are represented as features for visual recognition. This dissertation introduces a text-based image feature and demonstrates that it consistently improves performance on hard object classification problems. The feature is built using an auxiliary dataset of images annotated with tags, downloaded from the Internet; since image tags are noisy, the method obtains the text features of an unannotated image from the tags of its k-nearest neighbors in this auxiliary collection. A visual classifier presented with an object viewed under novel circumstances (say, a new viewing direction) must rely on its visual examples alone, whereas this text feature may change little, because the auxiliary dataset likely contains a similar picture: although the tags associated with images are noisy, they are more stable than appearance when viewing conditions change. The performance of this feature is tested on the PASCAL VOC 2006 and 2007 datasets. The feature performs well; it consistently improves the performance of visual object classifiers, and is particularly effective when the training dataset is small.
With more and more training data collected, computational cost becomes a bottleneck, especially when training sophisticated classifiers such as kernelized SVMs. This dissertation proposes a fast training algorithm called the Stochastic Intersection Kernel Machine (SIKMA). The proposed training method will be useful for many vision problems, as it can produce a kernel classifier that is more accurate than a linear classifier and can be trained on tens of thousands of examples in two minutes. It processes training examples one by one in sequence, so memory cost is no longer the bottleneck for large-scale datasets. This dissertation applies the approach to train classifiers for Flickr groups, each with many training examples. The resulting Flickr group prediction scores can be used to measure the similarity between two images. Experimental results on the Corel dataset and a PASCAL VOC dataset show that the learned Flickr features perform better on image matching, retrieval, and classification than conventional visual features. Visual models are usually trained to best separate positive and negative training examples. However, when recognizing a large number of object categories, there may not be enough training examples for most objects, due to the intrinsic long-tailed distribution of objects in the real world. This dissertation proposes an approach based on comparative object similarity. The key insight is that, given a set of categories similar to a target category and a set of dissimilar categories, a good object model should respond more strongly to examples from the similar categories than to examples from the dissimilar ones. This dissertation develops a regularized kernel machine algorithm that uses this category-dependent similarity regularization. Experiments on hundreds of categories show that the method yields significant improvements for categories with few or even no positive examples.
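The one-example-at-a-time training style can be sketched with a Pegasos-type stochastic subgradient SVM. The actual SIKMA algorithm additionally exploits the structure of the intersection kernel, which is omitted here; this is only a generic online analogue, and the data in the test are synthetic:

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=10, seed=0):
    # Pegasos-style stochastic subgradient training on the hinge loss:
    # one example at a time, so memory stays O(d) regardless of dataset size
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, t = np.zeros(d), 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            w *= (1 - eta * lam)           # shrink (regularization subgradient)
            if y[i] * (w @ X[i]) < 1:      # margin violation: hinge subgradient
                w += eta * y[i] * X[i]
    return w
```

Because each update touches only one example, the same loop scales to tens of thousands of examples without holding a kernel matrix in memory, which is the property the abstract highlights.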
Abstract:
Freshwater mussel (Mollusca, Bivalvia, Unionoida) populations are among the most endangered faunistic groups. Mussels play an important role in the functioning of aquatic ecosystems, because they are responsible for the filtration and purification of water. They have a complex life cycle, with a parasitic larval stage and a usually limited set of host fish species. The real status of these populations is still poorly understood worldwide. The objective of the present work was to study the bioecology of duck mussel (Anodonta anatina L.) populations in the Tua basin (NE Portugal). The ecological status of the Rabaçal, Tuela and Tua rivers was characterized at 15 sampling sites, equally distributed among the three rivers. Sampling took place in the winter of 2016; several physico-chemical water parameters were measured and two habitat quality indexes were calculated (the GQC and QBR indexes). Benthic macroinvertebrate communities were sampled following the protocols established by the Water Framework Directive. Host fish for the duck mussel were determined under laboratory conditions, testing several native and exotic fish species. The results showed that several water quality variables (e.g. dissolved oxygen, conductivity, pH, total dissolved solids, and nutrients) can be used for the classification of river typology. Other responsive metrics were also determined to identify environmental degradation: hydromorphological conditions (the GQC and QBR indexes) and biota-related metrics (e.g. composition, distribution, abundance, and diversity of invertebrate communities) contributed to the evaluation of ecological integrity. The upper zones of the Rabaçal and Tuela rivers were classified as having excellent and good ecological integrity, while lower quality was found in the downstream zones. The host fish tests showed that only native species are effective hosts, which is essential for the conservation of this mussel species.
Threats such as pollution, sedimentation and river regularization (three large dams are under construction or in the filling phase) are the main causes of future habitat loss for native mussel and fish populations. Rehabilitation and mitigation measures are essential for these lotic ecosystems in order to preserve the priority habitats and the heavily threatened native species.