981 results for Modern Art
Abstract:
Date of Acceptance: 13/04/2015
Abstract:
X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; this fact, coupled with its popularity, makes CT the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.
A recent push from the medical and scientific community towards lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. There is therefore a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate for modern CT scanners that implement these dose reduction technologies.
Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.
The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection, FBP, vs. Advanced Modeled Iterative Reconstruction, ADMIRE). A mathematical observer model (i.e., a computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (an increase in detectability index of up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (≤6 mm), low-contrast (≤20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.
Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would yield image quality metrics that best correlated with human detection performance. The models ranged from simple metrics of image quality, such as the contrast-to-noise ratio (CNR), to more sophisticated observer models, such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that the non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found not to correlate strongly with human performance, especially when comparing different reconstruction algorithms.
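To make these metrics concrete, the sketch below shows one common way CNR and a spatial-domain non-prewhitening (NPW) matched filter detectability index can be computed from stacks of signal-present and signal-absent ROIs. This is a generic illustration of the standard definitions, not the dissertation's implementation; the array shapes, lesion contrast, and noise level are made up.

```python
import numpy as np

def cnr(signal_rois, background_rois):
    """Contrast-to-noise ratio from matched ROI stacks of shape (N, H, W)."""
    contrast = signal_rois.mean() - background_rois.mean()
    return contrast / background_rois.std()

def npw_dprime(signal_rois, background_rois):
    """Spatial-domain NPW matched filter detectability index d'.

    The template is the mean signal-present minus mean signal-absent
    image; d' compares the template response across the two classes.
    """
    template = signal_rois.mean(axis=0) - background_rois.mean(axis=0)
    t_sig = np.tensordot(signal_rois, template, axes=([1, 2], [0, 1]))
    t_bkg = np.tensordot(background_rois, template, axes=([1, 2], [0, 1]))
    pooled_var = 0.5 * (t_sig.var() + t_bkg.var())
    return (t_sig.mean() - t_bkg.mean()) / np.sqrt(pooled_var)

# Illustrative use: a faint 5 HU disk in white noise (sigma = 10 HU).
rng = np.random.default_rng(0)
yy, xx = np.mgrid[:64, :64]
disk = 5.0 * ((yy - 32) ** 2 + (xx - 32) ** 2 < 8 ** 2)
bkg = rng.normal(0.0, 10.0, size=(200, 64, 64))
sig = rng.normal(0.0, 10.0, size=(200, 64, 64)) + disk
print(f"CNR = {cnr(sig, bkg):.2f}, NPW d' = {npw_dprime(sig, bkg):.2f}")
```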
The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, given their simplicity and the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that with FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it was clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., the NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing the image quality of iterative algorithms.
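For reference, the standard ensemble NPS estimate from repeated scans of a rectangular ROI can be sketched as below; the irregular-ROI method developed in the dissertation goes beyond this, and the pixel size and scan count here are illustrative only.

```python
import numpy as np

def nps_2d(roi_stack, pixel_size_mm):
    """Ensemble 2D noise power spectrum from repeated scans.

    roi_stack: (N, H, W) array holding the same ROI from N repeated scans.
    Returns the NPS (HU^2 mm^2) and the spatial-frequency axes (1/mm).
    """
    n, h, w = roi_stack.shape
    noise = roi_stack - roi_stack.mean(axis=0)  # remove the ensemble mean
    dft = np.fft.fftshift(np.fft.fft2(noise), axes=(-2, -1))
    nps = (pixel_size_mm ** 2 / (h * w)) * np.mean(np.abs(dft) ** 2, axis=0)
    fx = np.fft.fftshift(np.fft.fftfreq(w, d=pixel_size_mm))
    fy = np.fft.fftshift(np.fft.fftfreq(h, d=pixel_size_mm))
    return nps, fx, fy

# Sanity check on 50 synthetic "scans" of white noise (sigma = 10 HU):
# the NPS should integrate back to the pixel variance (~100 HU^2).
rng = np.random.default_rng(1)
nps, fx, fy = nps_2d(rng.normal(0, 10, size=(50, 128, 128)), pixel_size_mm=0.5)
print(np.sum(nps) * (fx[1] - fx[0]) * (fy[1] - fy[0]))
```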
To move beyond assessing only noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized with a genetic algorithm to match the texture in the liver regions of actual patient CT images. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture (a simplified version is sketched below). Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in the textured phantoms.
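A Clustered Lumpy Background is generated by scattering clusters of small "blob" functions at random positions and summing them. The sketch below is a simplified, isotropic version of that idea; the published framework uses anisotropic, randomly oriented blobs, and the parameter values here are illustrative rather than those fitted by the genetic algorithm.

```python
import numpy as np

def clustered_lumpy_background(size=256, n_clusters=60, blobs_per_cluster=20,
                               cluster_sigma=8.0, blob_radius=3.0,
                               alpha=2.1, beta=0.5, seed=None):
    """Simplified isotropic Clustered Lumpy Background texture.

    A Poisson number of clusters is placed uniformly; each cluster
    scatters a Poisson number of blobs (Gaussian spread around its
    centre), and every blob adds exp(-alpha * (r / blob_radius) ** beta).
    """
    rng = np.random.default_rng(seed)
    img = np.zeros((size, size))
    yy, xx = np.mgrid[:size, :size]
    for _ in range(rng.poisson(n_clusters)):
        cy, cx = rng.uniform(0, size, 2)
        for _ in range(rng.poisson(blobs_per_cluster)):
            by = cy + rng.normal(0, cluster_sigma)
            bx = cx + rng.normal(0, cluster_sigma)
            r = np.hypot(yy - by, xx - bx)
            img += np.exp(-alpha * (r / blob_radius) ** beta)
    return img

texture = clustered_lumpy_background(seed=42)
print(texture.shape, round(float(texture.mean()), 3))
```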
The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
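As a flavor of what such an analytical model can look like, the sketch below voxelizes a spherical lesion whose contrast falls off through a logistic (sigmoid) edge profile and adds it to an image ROI to form a hybrid image. This parameterization is a generic illustration under assumed names and values, not the dissertation's exact equations.

```python
import numpy as np

def lesion_model(shape, center, diameter, contrast_hu, edge_width):
    """Voxelized spherical lesion with a sigmoid edge profile.

    The lesion has the given contrast (HU) inside its radius and decays
    to zero across a transition band controlled by edge_width (voxels).
    """
    grid = np.indices(shape).astype(float)
    r = np.sqrt(sum((g - c) ** 2 for g, c in zip(grid, center)))
    return contrast_hu / (1.0 + np.exp((r - diameter / 2.0) / edge_width))

# Hypothetical use: insert a subtle -15 HU, 12-voxel lesion into a fake
# liver ROI; the ground-truth morphology and location are known exactly.
roi = np.random.default_rng(2).normal(60.0, 12.0, size=(32, 64, 64))
hybrid = roi + lesion_model(roi.shape, center=(16, 32, 32),
                            diameter=12.0, contrast_hu=-15.0, edge_width=1.5)
```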
Based on that result, two studies were conducted to demonstrate the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used here. That database contained images of the same patients at two dose levels (50% and 100%) along with three reconstruction algorithms from a GE 750HD CT system (GE Healthcare): FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.
The second study demonstrating the utility of the lesion modeling framework focused on assessing the detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5); lesion-free images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that, compared to FBP, SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased the detectability index by 65%. Further, a two-alternative forced-choice (2AFC) human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
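Under the standard 2AFC assumption that percent correct relates to the detectability index by Pc = Φ(d'/√2), measured 2AFC scores can be converted to d' and interpolated to find a matched-performance dose. The sketch below illustrates this with made-up numbers; the actual doses and scores of the study are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def dprime_from_2afc(pc):
    """Invert Pc = Phi(d' / sqrt(2)), the standard 2AFC relation."""
    return np.sqrt(2.0) * norm.ppf(pc)

# Made-up 2AFC results at four dose levels (% of routine dose).
doses = np.array([25.0, 50.0, 75.0, 100.0])
d_fbp = dprime_from_2afc(np.array([0.62, 0.71, 0.78, 0.83]))
d_saf = dprime_from_2afc(np.array([0.68, 0.77, 0.84, 0.88]))

# Dose at which SAFIRE matches FBP's full-dose d' (linear interpolation).
matched_dose = np.interp(d_fbp[-1], d_saf, doses)
print(f"dose reduction potential ~ {100.0 - matched_dose:.0f}%")
```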
In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.
Abstract:
This thesis is the first sustained assessment of Elizabeth Bowen’s writing from a visual perspective. By first compiling a visual biography of the author, I argue that Bowen’s responsiveness to art, her relationships with artists, and her knowledge of modern and traditional aesthetics are formative influences on her work. Investigating her assertion that she was a “visual writer,” my discussion develops into an examination of her technique of “verbal painting,” through which she reinvents traditional visual modes as a personal modernist idiom. Close textual analysis of Bowen’s fictions forms the dominant methodology of this thesis and facilitates my delineation of her engagement with Futurist and Surrealist aesthetics, in addition to broader aspects of her visuality, from her treatment of the “vividly visual” dream-state to the distinct ocularcentricity of her writing. Ultimately, this thesis seeks to advance our knowledge of Bowen’s visual method and to offer a new approach with which to nuance our understanding of her modernism.
Abstract:
From Leighton to Lucas, this fascinating book explores sculpture in Britain from the end of the Victorian era to the dawn of the new millennium. With incisive essays and previously unpublished archive material, this compelling book documents the seismic shifts in sculpture in the last century. Edited by Penelope Curtis (former Curator of the Henry Moore Institute, now Director of Tate Britain) and Keith Wilson (sculptor, tutor at the Royal College of Art and Reader in Fine Art at the University of Westminster).
Abstract:
In Edo-period Japan (c. 1603–1868), shunga (sexually explicit prints, paintings, and illustrated books) were widely produced and disseminated. However, from the 1850s onwards, shunga was suppressed by the government, and it has largely been omitted from art history, excluded from exhibitions, and censored in publications. Although changes have taken place, cultural institutions continue to be cautious about what they collect and exhibit, with shunga largely remaining a prohibited subject in Japan. Since the 1970s there has been a gradual increase in the acceptance of shunga outside Japan, as evidenced by the growing number of exhibitions and publications. The initial impetus behind this thesis was the question: why and how did shunga become increasingly acceptable in Europe and North America in the twentieth century, whilst conversely becoming unacceptable in post-Edo Japan? I discuss how and why attitudes to shunga in the UK and Japan have changed from the Edo period to the present day, and consider how definitions can affect this. My research examines how shunga has been dealt with in relation to private and institutional collecting and exhibitions. In order to gauge modern responses, the 2013 Shunga: Sex and Pleasure in Japanese Art exhibition at the British Museum is used as an in-depth case study, utilising mixed methods and an interdisciplinary approach to analyse curatorial and legal decisions, as well as visitor feedback. To date there are no official or standardised guidelines for the acquisition, cataloguing, or display of sexually explicit artefacts. It is intended that institutions will benefit from my analysis of the changing perceptions of shunga and of previous shunga collections and exhibitions when dealing with shunga or other sexually explicit items in the future.
Abstract:
The nineteenth-century Romantic era saw the development and expansion of many vocal and instrumental forms that had originated in the Classical era. In particular, the German lied and French mélodie matured as art forms, and they found a kind of equilibrium between piano and vocal lines. Similarly, the nineteenth-century piano quartet came into its own as a form of true chamber music in which all instruments participated equally in the texture. Composers such as Robert Schumann, Johannes Brahms, and Gabriel Fauré offer particularly successful examples of both art song and piano quartets that represent these genres at their highest level of artistic complexity. Their works have become cornerstones of the modern collaborative pianist’s repertoire. My dissertation explored both the art songs and the piano quartets of these three composers and studied the different skills needed by a pianist performing both types of works. This project included the following art song cycles: Robert Schumann’s Dichterliebe, Gabriel Fauré’s Poème d’un Jour, and Johannes Brahms’ Zigeunerlieder. I also performed Schumann’s Piano Quartet in E-flat Major, Op. 47, Fauré’s Piano Quartet in C minor, Op. 15, and Brahms’ Piano Quartet in G minor, Op. 25. My collaborators included Zachariah Matteson, violin and viola; Kristin Bakkegard, violin; Molly Jones, cello; Geoffrey Manyin, cello; Karl Mitze, viola; Emily Riggs, soprano; and Matthew Hill, tenor. This repertoire was presented over the course of three recitals on February 13, 2015, December 11, 2015, and March 25, 2016 at the University of Maryland’s Gildenhorn Recital Hall. These recitals can be found in the Digital Repository at the University of Maryland (DRUM).
Abstract:
This thesis investigates how ways of being in different ontologies emerge from material and embodied practice. This general concern is explored through the particular case study of Scotland in the period of the witch trials (the 16th and 17th centuries C.E.). The field of early modern Scottish witchcraft studies has been active and dynamic over the past 15 years, but its prioritisation of what people said over what they did leaves a clear gap for a situated and relational approach focusing upon materiality. Such an approach requires a move away from the Cartesian dichotomies of modern ontology to recognise past beliefs as real to those who experienced them, co-constitutive of embodiment and of the material worlds people inhabited. In theory, method and practice, this demands a different way of exploring past worlds to avoid flattening strange data. To this end, the study incorporates narratives and ‘disruptions’ – unique engagements with Contemporary Art which facilitate understanding by enabling the temporary suspension of disbelief. The methodology is iterative, tacking between material and written sources in order to better understand the heterogeneous assemblages of early modern (counter-)witchcraft. Previously separate areas of discourse are (re-)constituted into alternative ontic categories of newly-parallel materials. New interpretations of things, places, bodies and personhoods emerge, raising questions about early modern experiences of the world. Three thematic chapters explore different sets of collaborative agencies as they entwine into new things, co-fabricating a very different world. Moving between witch trial accounts, healing wells, infant burial grounds, animals, discipline artefacts and charms, the boundaries of all prove highly permeable. People, cloth and place bleed into one another through contact; trees and water emerge as powerful agents of magical place-making; and people and animals meet to become single hybrid-persons spread over two bodies. Life and death consistently emerge as protracted processes with the capacity to overlap and occur simultaneously in problematic ways. The research presented in this thesis establishes a new way of looking at the nature of Being as experienced by early modern Scots. This provides a foundation for further studies, which can draw in other materials not explored here, such as communion wares and metal charms. Comparison with other early modern Western societies may also prove fruitful. Furthermore, the methodology may be suitable for application to other interdisciplinary projects incorporating historical and material evidence.
Abstract:
A High-Performance Computing (HPC) job dispatcher is a critical software component that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time required cannot exceed some threshold without affecting normal system functioning. In addition, a job dispatcher must deal with considerable uncertainty: submission times, the number of requested resources, and job durations are not known in advance. Heuristic-based techniques have been broadly used in HPC systems because they achieve (sub-)optimal solutions in a short time. However, their scheduling and resource allocation components are separated, which produces decoupled decisions that may cause a performance loss. Optimization-based techniques are less used for this problem, although they can significantly improve the performance of HPC systems at the expense of higher computation time. Nowadays, HPC systems are being used for modern applications, such as big data analytics and predictive model building, that in general employ many short jobs. However, this information is unknown at dispatching time, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to tackle job dispatching problems. However, state-of-the-art CP-based job dispatchers are unable to satisfy the challenges of on-line dispatching, such as generating dispatching decisions within a brief period and integrating current and past information about the hosting system. For these reasons, we propose CP-based dispatchers that are more suitable for HPC systems running modern applications: they generate on-line dispatching decisions within an appropriate time and make effective use of job duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
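The thesis's models are not reproduced in this abstract, but the core problem it targets (jobs with durations and resource demands dispatched onto finite resources) can be sketched as a small CP model with Google OR-Tools CP-SAT. The job data, capacity, and objective below are illustrative assumptions; a real on-line dispatcher would re-solve a model like this at every scheduling event.

```python
from ortools.sat.python import cp_model

# Toy workload: (duration, cores requested) per job -- illustrative only.
jobs = [(4, 2), (2, 1), (6, 3), (1, 2), (3, 1)]
capacity = 4                       # total cores in the toy system
horizon = sum(d for d, _ in jobs)  # trivial upper bound on completion

model = cp_model.CpModel()
starts, ends, intervals = [], [], []
for i, (dur, _) in enumerate(jobs):
    s = model.NewIntVar(0, horizon, f"start_{i}")
    e = model.NewIntVar(0, horizon, f"end_{i}")
    intervals.append(model.NewIntervalVar(s, dur, e, f"job_{i}"))
    starts.append(s)
    ends.append(e)

# Cores are a cumulative resource: concurrent demand must fit capacity.
model.AddCumulative(intervals, [req for _, req in jobs], capacity)

# Minimise total completion time, a simple QoS proxy that favours
# finishing short jobs early.
model.Minimize(sum(ends))

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    for i, s in enumerate(starts):
        print(f"job {i}: start t={solver.Value(s)}")
```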
Abstract:
Cultural heritage is constituted by complex and heterogeneous materials, ranging from paintings to ancient remains. All ancient materials are exposed to the external environment, and this interaction produces changes due to chemical, physical, and biological phenomena. The organic fraction, especially the proteinaceous one, has a crucial role in all these materials: in archaeology, proteins reveal human habits; in artworks, they disclose techniques and support correct restoration. For these reasons, it is fundamental to develop methods that preserve the sample as much as possible while providing deeper knowledge of the deterioration processes. The research activities presented in this PhD thesis focused on the development of new immunochemical and spectroscopic approaches to detect and identify organic substances in artistic and archaeological samples. Organic components can be present in different cultural heritage materials as constituent elements (e.g., binders in paintings, collagen in bones), and knowledge of them is fundamental for a complete understanding of past life, degradation processes, and appropriate restoration approaches. The combination of an immunological approach with chemiluminescence detection and Laser Ablation-Inductively Coupled Plasma-Mass Spectrometry allowed a sensitive and selective localization of collagen and elements in ancient bones and teeth. Near-infrared spectroscopy and hyperspectral imaging were applied in combination with chemometric data analysis as non-destructive methods for prescreening bones for the localization of collagen. Moreover, an investigation of amino acids in enamel was carried out in order to clarify the survival of tooth biomolecules over time, through the optimization and application of High-Performance Liquid Chromatography on modern and ancient enamel powder. Finally, new portable biosensors were developed for ovalbumin identification in paintings by combining biocompatible gellan gel with electrochemical immunosensors, allowing painting binders to be extracted and identified through contact alone, first between the gel and the painting and then between the gel and the electrodes.
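The chemometric prescreening step pairs non-destructive spectra with a multivariate model. As a generic illustration of that kind of pipeline (not the thesis's actual model or spectral bands), the sketch below compresses synthetic NIR-like spectra with PCA and trains a classifier to flag collagen-rich samples.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for NIR spectra: 200 samples x 500 wavelengths,
# where "collagen-preserved" samples gain an extra absorption band.
# The band position and widths are made up for illustration.
rng = np.random.default_rng(3)
wavelengths = np.linspace(1000, 2500, 500)    # nm
band = np.exp(-((wavelengths - 2050) / 40.0) ** 2)
y = rng.integers(0, 2, size=200)              # 1 = collagen preserved
X = rng.normal(0, 0.02, size=(200, 500)) + np.outer(0.1 * y, band)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(PCA(n_components=10), LogisticRegression())
clf.fit(X_train, y_train)
print(f"held-out screening accuracy: {clf.score(X_test, y_test):.2f}")
```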
Abstract:
In modern society, security issues of IT systems are intertwined with interdisciplinary aspects, from social life to sustainability, and threats endanger many aspects of everyone's daily life. To address the problem, it is important that the systems we use guarantee a certain degree of security; to achieve this, it is necessary to be able to measure that security. Measuring security is not an easy task, but many initiatives, including European regulations, aim to make it possible. One method of measuring security is based on security metrics: ways of assessing, from various angles, vulnerabilities, methods of defense, risks and impacts of successful attacks, and the efficacy of reactions, giving precise results using mathematical and statistical techniques. I have conducted a literature review to provide an overview of the meaning, effects, problems, applications, and overall current state of security metrics, with particular emphasis on giving practical examples. This thesis starts with a summary of the state of the art in the field of security metrics and application examples, in order to outline the gaps in the current literature and the difficulties encountered when the application context changes, and then advances research questions aimed at fostering the discussion towards a more complete and applicable view of the subject. Finally, it stresses the lack of security metrics that consider interdisciplinary aspects, giving some potential starting points for developing security metrics that cover all aspects involved, taking the field to a new level of formal soundness and practical usability.
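As one concrete example of the kind of metric surveyed, the sketch below computes two simple indicators over a toy set of findings: vulnerability density per KLOC and a severity-weighted residual exposure. The inventory, scores, and weighting are illustrative assumptions, not a published standard.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    component: str
    cvss: float      # CVSS base score, 0.0 - 10.0
    patched: bool

# Toy inventory of findings and component sizes (illustrative).
findings = [
    Finding("web-frontend", 9.8, False),
    Finding("web-frontend", 5.3, True),
    Finding("auth-service", 7.5, False),
    Finding("batch-worker", 4.0, True),
]
kloc = {"web-frontend": 120, "auth-service": 45, "batch-worker": 80}

# Metric 1: vulnerability density (findings per thousand lines of code).
density = len(findings) / sum(kloc.values())

# Metric 2: residual exposure -- summed CVSS of unpatched findings,
# normalised by the worst case in which nothing is patched.
residual = sum(f.cvss for f in findings if not f.patched)
worst = sum(f.cvss for f in findings)

print(f"vulnerability density: {density:.3f} per KLOC")
print(f"residual exposure: {residual / worst:.0%} of worst case")
```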
Abstract:
Growth in the development and production of engineered nanoparticles (ENPs) in recent years has increased the potential for interactions of these nanomaterials with aquatic and terrestrial environments. Carefully designed studies are therefore required in order to understand the fate, transport, stability, and toxicity of nanoparticles. Natural organic matter (NOM), such as the humic substances found in water, sediment, and soil, is one of the substances capable of interacting with ENPs. This review presents the findings of studies of the interaction of ENPs and NOM, and the possible effects on nanoparticle stability and the toxicity of these materials in the environment. In addition, ENPs and NOM are utilized for many different purposes, including the removal of metals and organic compounds from effluents, and the development of new electronic sensors and other devices for the detection of active substances. Discussion is therefore provided of some of the ways in which NOM can be used in the production of nanoparticles. Although there has been an increase in the number of studies in this area, further progress is needed to improve understanding of the dynamic interactions between ENPs and NOM.
Abstract:
This paper discusses the claim that research in both theoretical and applied linguistics is situated, considers some of the implications of this claim, and argues that it is linked to the performativity of all assertions, including scientific ones. More importantly, I argue that it is the infinite regress of performativity that makes inevitable the passage from presumably 'dispassionate' research to militancy.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física