934 results for Fractal time-space
Abstract:
The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues and organs. Precise delineation of treatment and avoidance volumes is key to precision radiation therapy. In recent years, considerable clinical and research effort has been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft tissue contrast and functional imaging capability. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI’s clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI implementation and the need for novel DCE-MRI data analysis methods that capture richer functional heterogeneity information.
This study aims to improve current DCE-MRI techniques and to develop new DCE-MRI analysis methods for radiotherapy assessment. The study is therefore divided into two parts. The first part focuses on DCE-MRI temporal resolution, one of the key technical factors, and proposes improvements to it; the second part explores the potential value of image heterogeneity analysis and of combining multiple PK models for therapeutic response assessment, and develops several novel DCE-MRI data analysis methods.
I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel MR image iterative reconstruction algorithm was investigated for DCE-MRI reconstruction. This algorithm builds on the recently developed compressed sensing (CS) theory. By utilizing a limited k-space acquisition with shorter imaging time, images can be reconstructed in an iterative fashion under the regularization of a newly proposed total generalized variation (TGV) penalty term. In a retrospective, IRB-approved study of DCE-MRI scans from brain radiosurgery patients, the clinically acquired images were used as reference data, and accelerated k-space acquisition was simulated by undersampling the full k-space of the reference images with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution was adopted to sample each slice of the full k-space; 2) a Cartesian random sampling grid series with spatiotemporal constraints from adjacent frames was adopted to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated, from the undersampled data and from the fully sampled data, respectively. Multiple quantitative measurements and statistical tests were performed to evaluate the accuracy of the PK maps generated from the undersampled data relative to the PK maps generated from the fully sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from the DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully sampled data sets. These results suggest that DCE-MRI acceleration using the investigated image reconstruction method is feasible and promising.
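For orientation, the kind of TGV-regularized compressed sensing reconstruction described above can be written compactly as follows; the notation is assumed here for illustration and is not taken verbatim from the abstract:

\hat{u} = \arg\min_{u} \; \tfrac{1}{2}\,\| A\,\mathcal{F} u - y \|_2^2 + \lambda\,\mathrm{TGV}_{\alpha}^{2}(u),
\qquad
\mathrm{TGV}_{\alpha}^{2}(u) = \min_{w} \; \alpha_1 \| \nabla u - w \|_1 + \alpha_0 \| \mathcal{E}(w) \|_1,

where \mathcal{F} is the Fourier encoding operator, A the binary undersampling mask (radial multi-ray or Cartesian random), y the acquired k-space samples, and \mathcal{E}(w) the symmetrized gradient of the auxiliary field w.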
Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for PK parameters with better accuracy and efficiency. The method is based on a derivative-based reformulation of the commonly used Tofts PK model, which is conventionally presented as an integral expression. It also applies a Kolmogorov-Zurbenko (KZ) filter to suppress noise in the data and solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and noise levels. Results showed that at both high temporal resolutions (<1 s) and a clinically feasible temporal resolution (~5 s), the new method calculated PK parameters more accurately than current methods at clinically relevant noise levels; at high temporal resolutions, its calculation efficiency was superior to current methods by roughly two orders of magnitude. In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that the new method enables accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
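As a point of reference, the standard Tofts model and the derivative form that makes the fit linear in the parameters are sketched below; the thesis' exact reformulation and KZ-filtering step may differ in detail:

C_t(t) = K^{\mathrm{trans}} \int_0^{t} C_p(\tau)\, e^{-k_{ep}(t-\tau)}\, d\tau
\quad\Longrightarrow\quad
\frac{dC_t}{dt} = K^{\mathrm{trans}}\, C_p(t) - k_{ep}\, C_t(t),

so that, after noise suppression, sampled values of C_p(t), C_t(t) and dC_t/dt yield an overdetermined linear system in (K^{\mathrm{trans}}, k_{ep}) that can be solved in matrix form, which is what makes densely sampled (high temporal resolution) data particularly attractive for this approach.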
II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part comprises methodological developments along two approaches. The first is to develop model-free analysis methods for evaluating DCE-MRI functional heterogeneity, motivated by the rationale that radiotherapy-induced functional change can be heterogeneous across the treatment area. The first effort was a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, randomly assigned treatment and control groups received multi-fraction treatments, with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan acquired two weeks after the start of treatment, the investigated Rényi dimensions of the classic PK rate constant map showed significant differences between the treatment and control groups; when the Rényi dimensions were used for treatment/control group classification, the achieved accuracy was higher than that obtained from conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed to address the lack of temporal information and the poor computational efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small-animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM performed better overall than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second method developed is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method quantitatively measures the dynamics of tumor heterogeneity during contrast agent uptake on DCE images. In the same small-animal experiment, selected parameters from the dynamic FSD analysis showed significant treatment/control differences as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. Treatment/control classification after the first treatment fraction was also improved when using dynamic FSD parameters rather than conventional PK statistics. These results suggest that this novel method is promising for capturing early therapeutic response.
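For context, the Rényi (generalized) dimensions referred to above are commonly defined as below; how the thesis binned and normalized the PK maps is an implementation detail not specified in the abstract:

D_q = \lim_{\varepsilon \to 0} \frac{1}{q-1}\, \frac{\log \sum_i p_i(\varepsilon)^{\,q}}{\log \varepsilon},
\qquad
D_1 = \lim_{\varepsilon \to 0} \frac{\sum_i p_i(\varepsilon)\, \log p_i(\varepsilon)}{\log \varepsilon},

where p_i(\varepsilon) is the fraction of the (normalized) map intensity falling in box i of side \varepsilon; D_0, D_1 and D_2 correspond to the box-counting, information and correlation dimensions, respectively.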
The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model and its variants are widely adopted for DCE-MRI analysis as the gold-standard approach to therapeutic response assessment. Previously, the shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. Despite its richer biological assumptions, its application in therapeutic response assessment has been limited. It is therefore intriguing to combine information from the SS model and the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small-animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using regional mean values of the PK parameters. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model proved superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, a novel biomarker was designed to integrate the PK rate constants from these two models. When evaluated within the biological subvolume, this biomarker reflected significant treatment/control differences at both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.
In summary, this study addressed two problems in the application of DCE-MRI to radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit future routine clinical application of DCE-MRI in radiotherapy assessment.
Abstract:
Multi-output Gaussian processes provide a convenient framework for multi-task problems. An illustrative and motivating example of a multi-task problem is multi-region electrophysiological time-series data, where experimentalists are interested in both power and phase coherence between channels. Recently, the spectral mixture (SM) kernel was proposed to model the spectral density of a single task in a Gaussian process framework. This work develops a novel covariance kernel for multiple outputs, called the cross-spectral mixture (CSM) kernel. This new, flexible kernel represents both the power and phase relationship between multiple observation channels. The expressive capabilities of the CSM kernel are demonstrated through implementation of 1) a Bayesian hidden Markov model, where the emission distribution is a multi-output Gaussian process with a CSM covariance kernel, and 2) a Gaussian process factor analysis model, where factor scores represent the utilization of cross-spectral neural circuits. Results are presented for measured multi-region electrophysiological data.
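For reference, the single-output spectral mixture (SM) kernel has the form below, and a phase-shifted cross-channel extension of the same form conveys the idea behind a cross-spectral kernel; this is an illustrative sketch, not necessarily the exact CSM parameterization used in the work:

k_{\mathrm{SM}}(\tau) = \sum_{q=1}^{Q} w_q\, e^{-2\pi^2 \sigma_q^2 \tau^2} \cos(2\pi \mu_q \tau),
\qquad
k_{cc'}(\tau) = \sum_{q=1}^{Q} a_{cq}\, a_{c'q}\, e^{-2\pi^2 \sigma_q^2 \tau^2} \cos\bigl(2\pi \mu_q \tau + \phi_{cq} - \phi_{c'q}\bigr),

so that the diagonal entries (c = c') recover per-channel SM power spectra, while the off-diagonal phase offsets \phi_{cq} - \phi_{c'q} encode phase relationships between channels c and c'.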
Abstract:
In this thesis we study aspects of (0,2) superconformal field theories (SCFTs), which are suitable for compactification of the heterotic string. In the first part, we study a class of (2,2) SCFTs obtained by fibering a Landau-Ginzburg (LG) orbifold CFT over a compact Kähler base manifold. While such models are naturally obtained as phases in a gauged linear sigma model (GLSM), our construction is independent of such an embedding. We discuss the general properties of such theories and present a technique to study the massless spectrum of the associated heterotic compactification. We test the validity of our method by applying it to hybrid phases of GLSMs and comparing spectra among the phases. In the second part, we turn to the study of the role of accidental symmetries in two-dimensional (0,2) SCFTs obtained by RG flow from (0,2) LG theories. These accidental symmetries are ubiquitous, and, unlike in the case of (2,2) theories, their identification is key to correctly identifying the IR fixed point and its properties. We develop a number of tools that help to identify such accidental symmetries in the context of (0,2) LG models and provide a conjecture for a toric structure of the SCFT moduli space in a large class of models. In the final part, we study the stability of heterotic compactifications described by (0,2) GLSMs with respect to worldsheet instanton corrections to the space-time superpotential following the work of Beasley and Witten. We show that generic models elude the vanishing theorem proved there, and may not determine supersymmetric heterotic vacua. We then construct a subclass of GLSMs for which a vanishing theorem holds.
Abstract:
This is a study of women's magazine consumption in the home. It explores issues of time and space, and addresses the importance that the women who took part in the study place on magazine consumption in their lives, given the 'juggling' lifestyles experienced by most of them. The study reveals family life to be a landscape within which these women carve out what they perceive as valuable and rare time and space for themselves. The authors argue that in contemporary life women's magazines play a key part in the quest for me-time and time away from others, in both a tangible and an experiential sense.
Abstract:
SELECTOR is a software package for studying the evolution of multiallelic genes under balancing or positive selection while simulating complex evolutionary scenarios that integrate demographic growth and migration in a spatially explicit population framework. Parameters can be varied both in space and time to account for geographical, environmental, and cultural heterogeneity. SELECTOR can be used within an approximate Bayesian computation estimation framework. We first describe the principles of SELECTOR and validate the algorithms by comparing its outputs for simple models with theoretical expectations. Then, we show how it can be used to investigate genetic differentiation of loci under balancing selection in interconnected demes with spatially heterogeneous gene flow. We identify situations in which balancing selection reduces genetic differentiation between population groups compared with neutrality and explain conflicting outcomes observed for human leukocyte antigen loci. These results and three previously published applications demonstrate that SELECTOR is efficient and robust for building insight into human settlement history and evolution.
Colonialism, political unconscious and cognitive mapping in the space of the film "Captain Phillips"
Abstract:
This article develops a Marxist analysis of the US film "Captain Phillips" (Paul Greengrass, 2013), which is based on a true story. I show how the evolution of capitalism in the West continues to consolidate the reified belief in the historical and geographical superiority of Western political and socioeconomic models over those of Africa and Asia. At the same time, drawing on categories such as dialectical materialism, the critique of diffusionist theory, and the application of cognitive mapping to large geopolitical spaces located in the poorest areas of the world, I offer an observation on how the political unconscious of the working class in rich countries and of the poor in poor countries is currently being articulated, establishing a relationship between the ideological representation an individual forms of his historical reality (on a scale that moves from the local to the global) and the mental ability he has developed to escape the responsibility of critically reviewing what is happening around him in all areas. Finally, through the physical space captured in the film, I develop a materialist critique of the globalized business process carried out through the transport of goods, outlining the spatial and cognitive limits of the mentality of our time, among both the "winners" and the "losers", based on the spatial movement of capital.
Abstract:
This chapter summarises some of the learning from a material practice that sits in a sisterly manner next to architecture. Drawing on feminist writing and the experiences of women in professional life more generally, the chapter examines how mainstream understandings of time and technology limit the engagement of those people in society who do not fit given norms. The chapter argues that when we examine such concepts in more detail and expand them to reflect diverse experiences, those very same concepts offer new potentials and innovative openings for the progression of disciplines such as architecture.
Abstract:
Although it has been proposed that the retinal vasculature has a fractal structure, no standardization of the segmentation method or of the method for calculating fractal dimensions has been established. This study aimed to determine whether the estimation of the fractal dimensions of the retinal vasculature depends on the vascular segmentation methods and on the dimension calculation methods. Methods: Ten retinal photographs were segmented to extract their vascular trees by four computational methods ("multithreshold", "scale-space", "pixel classification" and "ridge based detection"). Their "information", "mass-radius" and "box-counting" fractal dimensions were then calculated and compared with the dimensions of the same vascular trees obtained by manual segmentation (gold standard). Results: The mean fractal dimensions varied across the groups of different segmentation methods: from 1.39 to 1.47 for the box-counting dimension, from 1.47 to 1.52 for the information dimension, and from 1.48 to 1.57 for the mass-radius dimension. The use of different computational vascular segmentation methods, as well as of different dimension calculation methods, introduced statistically significant differences in the fractal dimension values of the vascular trees. Conclusion: The estimation of the fractal dimensions of the retinal vasculature depended both on the vascular segmentation methods and on the dimension calculation methods used.
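To make the box-counting dimension concrete, here is a minimal Python sketch that estimates it from a binary vessel mask; the function name, box sizes and the random demo mask are illustrative assumptions, not the study's actual pipeline, and the information and mass-radius dimensions would require separate estimators:

import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32, 64)):
    """Estimate the box-counting (capacity) dimension of a 2-D binary mask,
    e.g. a segmented vascular tree. Returns the slope of log N(eps) versus
    log(1/eps) fitted by least squares."""
    mask = np.asarray(mask, dtype=bool)
    counts = []
    for s in box_sizes:
        # Trim so the image splits evenly into s x s boxes.
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        trimmed = mask[:h, :w]
        # A box is "occupied" if it contains at least one vessel pixel.
        boxes = trimmed.reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    log_inv_eps = np.log(1.0 / np.asarray(box_sizes, dtype=float))
    log_counts = np.log(np.asarray(counts, dtype=float))
    slope, _ = np.polyfit(log_inv_eps, log_counts, 1)
    return slope

# Hypothetical usage on a random sparse mask (a stand-in for a segmented retina):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.random((256, 256)) < 0.05
    print(box_counting_dimension(demo))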
Abstract:
This is the first time a multidisciplinary team has employed an iterative co-design method to determine the ergonomic layout of an emergency ambulance treatment space. This process allowed the research team to understand how treatment protocols were performed and to develop analytical tools for reaching an optimum configuration towards ambulance design standardisation. Fusari conducted participatory observations during 12-hour shifts with front-line ambulance clinicians, hospital staff and patients to understand the details of their working environments while responding to urgent and emergency calls. A simple yet accurate 1:1 mock-up of the existing ambulance was built for detailed analysis of these procedures through simulations. Paramedics were invited to take part in interviews and role-playing inside the model to recreate tasks, show how they are performed and what equipment is used, and to identify the limitations of the current ambulance. The use of Link Analysis distilled five modes of use. In parallel, an exhaustive audit of all equipment and consumables used in ambulances was performed (logging and photography) to define space use. This work produced 12 layout options, which were refined, modelled in CAD and presented back to paramedics. The preferred options and features were then developed into a full-size test rig and appearance model. Two key studies informed the process. The 2005 National Patient Safety Agency funded study "Future Ambulances" outlined nine design challenges for future standardisation of emergency vehicles and equipment. Secondly, the 2007 EPSRC funded "Smart Pods" project investigated a new system of mobile urgent and emergency medicine to treat patients in the community. A full-size mobile demonstrator unit featuring the evidence-based ergonomic layout was built for clinical tests through simulated emergency scenarios. Results from the clinical trials clearly show that the new layout improves infection control, speeds up treatment, and makes it easier for ambulance crews to follow correct clinical protocols.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Americans are accustomed to a wide range of data collection in their lives: census, polls, surveys, user registrations, and disclosure forms. When logging onto the Internet, users’ actions are tracked everywhere: clicking, typing, tapping, swiping, searching, and placing orders. All of this data is stored to create data-driven profiles of each user. Social network sites, furthermore, set the voluntary sharing of personal data as the default mode of engagement. But the time and energy people devote to creating this massive amount of data, on paper and online, are taken for granted. Few people would consider the time and energy they spend on data production as labor. Even those who do acknowledge their labor for data tend to see it as accessory to the activities at hand. In the face of pervasive data collection and the rising time spent on screens, why do people keep ignoring their labor for data? How has labor for data become invisible, something disregarded by many users? What does invisible labor for data imply for everyday cultural practices in the United States? Invisible Labor for Data addresses these questions. I argue that three intertwined forces contribute to framing data production as void of labor: data production institutions throughout history, the Internet’s technological infrastructure (especially the implementation of algorithms), and the multiplication of virtual spaces. There is a common tendency in the framework of human interactions with computers to deprive data and bodies of their materiality. My Introduction and Chapter 1 offer theoretical interventions by reinstating embodied materiality and redefining labor for data as an ongoing process. The middle chapters present case studies explaining how labor for data is pushed to the margins of narratives about data production. I focus on a nationwide debate in the 1960s on whether the U.S. should build a databank, contemporary Big Data practices in the data broker and Internet industries, and the people hired to produce data for other people’s avatars in virtual games. I conclude with a discussion of how the new development of crowdsourcing projects may usher in a new chapter in the exploitation of invisible and discounted labor for data.
Abstract:
This work presents a periodic state space model for monthly temperature data. Additionally, some issues are discussed, such as parameter estimation and the Kalman filter recursions adapted to a periodic model. The framework is applied to a long-term monthly temperature time series from Lisbon.
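A generic form that such a periodic state space model can take, with the system matrices cycling over the twelve months, is sketched below; the notation is assumed and the paper's specific parameterization may differ:

y_t = Z_{m(t)}\, \alpha_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, H_{m(t)}),
\qquad
\alpha_{t+1} = T_{m(t)}\, \alpha_t + \eta_t, \qquad \eta_t \sim N(0, Q_{m(t)}),

where m(t) = 1 + ((t-1) \bmod 12) indexes the month; the usual Kalman filter recursions apply unchanged, except that the system matrices are selected according to m(t) at each step.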
Abstract:
A structural time series model is one which is set up in terms of components that have a direct interpretation. In this paper, the discussion focuses on a dynamic modeling procedure based on the state space approach (associated with the Kalman filter) in the context of surface water quality monitoring, in order to analyze and evaluate the temporal evolution of environmental variables and thus identify trends or possible changes in water quality (change point detection). The approach is applied to environmental time series: time series of surface water quality variables in a river basin. The statistical modeling procedure is applied to monthly values of physico-chemical variables measured in a network of 8 water monitoring sites over a 15-year period (1999-2014) in the River Ave hydrological basin, located in the Northwest region of Portugal.
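As a minimal illustration of the state space / Kalman filter machinery on a single monthly series, the Python sketch below filters a local-level model and returns standardized one-step-ahead prediction errors, which are a common basis for change point detection; the variances, simulated series and function name are hypothetical, and the paper's models may include trend and seasonal components:

import numpy as np

def local_level_filter(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter for the local-level model:
        y_t = mu_t + eps_t,       eps_t ~ N(0, sigma_eps2)
        mu_{t+1} = mu_t + eta_t,  eta_t ~ N(0, sigma_eta2)
    Returns filtered state means, state variances, and standardized
    one-step-ahead prediction errors (useful for spotting change points)."""
    n = len(y)
    a = np.empty(n)      # filtered state mean
    p = np.empty(n)      # filtered state variance
    e_std = np.empty(n)  # standardized innovations
    a_pred, p_pred = a0, p0
    for t in range(n):
        f = p_pred + sigma_eps2            # prediction error variance
        v = y[t] - a_pred                  # one-step-ahead prediction error
        k = p_pred / f                     # Kalman gain
        a[t] = a_pred + k * v
        p[t] = p_pred * (1.0 - k)
        e_std[t] = v / np.sqrt(f)
        a_pred, p_pred = a[t], p[t] + sigma_eta2   # time update
    return a, p, e_std

# Hypothetical usage on a simulated monthly series with a level shift:
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    mu = np.concatenate([np.full(90, 5.0), np.full(90, 7.0)])
    y = mu + rng.normal(0.0, 0.8, size=180)
    _, _, e = local_level_filter(y, sigma_eps2=0.64, sigma_eta2=0.01)
    print(np.argmax(np.abs(e)))  # largest standardized innovation near the shift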
Abstract:
Brazil is home to one of the richest avifaunas in the world, which is subject to high levels of environmental degradation, in particular forest fragmentation. The Atlantic Forest biome illustrates this history of devastation and today persists as small isolated fragments in highly degraded landscapes. This project aimed to evaluate the effects of forest fragmentation, in an area with Atlantic Forest remnants in northern Paraná (Brazil), on the distribution and organization of the forest bird assemblage, and tested the hypothesis that the structure of the assemblage in the fragments differs from that expected by chance. We carried out four qualitative bird surveys in three sets of forest fragments in the landscape, each with three fragments: large, medium and small. The sampling method was point counts along transects, traveled at random for four hours in each fragment. Samples were taken in two periods: September to November 2013 and March to May 2014. The assemblage structure was assessed by species co-occurrence indices (checkerboard and C-score) and by patterns of α diversity (richness) and β diversity (species turnover), while the landscape structure was analyzed using the following parameters: area, distance between fragments, fractal dimension, edge density, fragment shape index and core area index. The null hypothesis of no structure in the bird assemblage of the landscape was tested with null models based on the co-occurrence indices. The effects of landscape structure on assemblage structure were analyzed with the Mantel test and principal component analysis (PCA). The assemblage structure in the landscape showed a pattern of spatiotemporal organization significantly different from that expected by chance, revealing a structure shaped mostly by species segregation. The fragments showed significant differences in richness, unlike the sets of fragments, indicating relative homogeneity in the landscape structure. Differences in fragment size and in the distances between fragments significantly influenced the organizational patterns of the forest bird assemblage in the landscape and the patterns of α and β diversity, indicating that the larger the fragments and the smaller the distances between them, the more the pattern of species co-occurrence differs from that expected by chance. Thus, the fragmented landscape of Atlantic Forest remnants in northern Paraná still offers the environmental resources and physical characteristics that allow a persistent organizational structure of the forest bird assemblage in space over time.
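For concreteness, a minimal Python sketch of the C-score computation and a simple row-shuffle null model is given below; the thesis may have used different null model constraints (e.g., fixed row and column totals), so this is an assumption-labeled illustration only:

import numpy as np
from itertools import combinations

def c_score(pa):
    """Mean checkerboard score (Stone & Roberts 1990) of a presence-absence
    matrix with species as rows and sites (fragments) as columns.
    For each species pair (i, j): C_ij = (R_i - S_ij) * (R_j - S_ij),
    where R_i and R_j are the species' site totals and S_ij the number of
    shared sites. Larger values indicate stronger segregation."""
    pa = np.asarray(pa, dtype=int)
    totals = pa.sum(axis=1)
    scores = []
    for i, j in combinations(range(pa.shape[0]), 2):
        shared = int(np.sum(pa[i] & pa[j]))
        scores.append((totals[i] - shared) * (totals[j] - shared))
    return float(np.mean(scores))

def null_distribution(pa, n_iter=1000, seed=0):
    """Null C-scores from an equiprobable null model that shuffles each
    species' occurrences (rows) across sites."""
    rng = np.random.default_rng(seed)
    pa = np.asarray(pa, dtype=int)
    null = np.empty(n_iter)
    for k in range(n_iter):
        shuffled = np.array([rng.permutation(row) for row in pa])
        null[k] = c_score(shuffled)
    return null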
Decoherence models for discrete-time quantum walks and their application to neutral atom experiments
Abstract:
We discuss decoherence in discrete-time quantum walks in terms of a phenomenological model that distinguishes spin and spatial decoherence. We identify the dominating mechanisms that affect quantum-walk experiments realized with neutral atoms walking in an optical lattice. From the measured spatial distributions, we determine with good precision the amount of decoherence per step, which provides a quantitative indication of the quality of our quantum walks. In particular, we find that spin decoherence is the main mechanism responsible for the loss of coherence in our experiment. We also find that the sole observation of ballistic (instead of diffusive) expansion in position space is not a good indicator of the range of coherent delocalization. We provide further physical insight by distinguishing the effects of short- and long-time spin dephasing mechanisms. We introduce the concept of coherence length in the discrete-time quantum walk, which quantifies the range of spatial coherences. Unexpectedly, we find that quasi-stationary dephasing does not modify the local properties of the quantum walk, but instead affects spatial coherences. For a visual representation of decoherence phenomena in phase space, we have developed a formalism based on a discrete analogue of the Wigner function. We show that the effects of spin and spatial decoherence differ dramatically in momentum space.
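As a toy illustration of how spin (coin) dephasing degrades a discrete-time quantum walk, the Python sketch below averages a Hadamard walk over trajectories in which the coin suffers a random phase flip at each step; this is not the phenomenological model of the paper, and the step count, dephasing probability and trajectory number are arbitrary assumptions:

import numpy as np

def walk_with_spin_dephasing(steps=40, p_dephase=0.1, n_traj=400, seed=0):
    """Monte Carlo sketch of a 1-D discrete-time quantum walk in which, after
    each step, the coin (spin) suffers a phase flip with probability p_dephase,
    a toy model of spin dephasing. Averaging the position distribution over
    trajectories unravels the dephasing channel."""
    rng = np.random.default_rng(seed)
    n_pos = 2 * steps + 1
    origin = steps
    hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    avg = np.zeros(n_pos)
    for _ in range(n_traj):
        psi = np.zeros((n_pos, 2), dtype=complex)
        psi[origin, 0] = 1.0                      # start at the origin, spin up
        for _ in range(steps):
            psi = psi @ hadamard.T                # coin toss on the spin index
            shifted = np.zeros_like(psi)
            shifted[1:, 0] = psi[:-1, 0]          # spin-up component moves right
            shifted[:-1, 1] = psi[1:, 1]          # spin-down component moves left
            psi = shifted
            if rng.random() < p_dephase:          # stochastic spin phase flip
                psi[:, 1] *= -1.0
        avg += (np.abs(psi) ** 2).sum(axis=1)     # position probabilities
    return avg / n_traj

# Hypothetical usage: compare the r.m.s. spread with and without dephasing.
if __name__ == "__main__":
    x = np.arange(-40, 41)
    for p in (0.0, 0.2):
        dist = walk_with_spin_dephasing(steps=40, p_dephase=p)
        print(p, float(np.sqrt(np.sum(dist * x ** 2))))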