929 results for Current systems


Relevance: 30.00%

Abstract:

This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing WIP and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete-event simulation based on the service-factory scenario of airline turnaround operations. To evaluate the method, a simulation model of aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied successfully to services. Bottlenecks in services must be defined as those processes whose process rates and remaining work are such that the process cannot be completed without an increase in the process rate. The bottleneck ratio is used to determine to what degree a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective of reducing costs related to specific flights, rather than the local-optimum operational approach of turning all aircraft quickly, results in significant savings to the company. Simulated savings to the airline's annual operating costs equaled 30% of current potential expenses for misconnecting passengers, with only a modest increase in worker utilization achieved through a more efficient heuristic of deploying workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach to managing service-factory operations similar to airline turnaround operations using the management philosophy of the Theory of Constraints.
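
As an illustration of how such a bottleneck ratio could drive the dispatch heuristic described above, the sketch below assumes the ratio is interpreted as remaining work divided by the work achievable at the current process rate before departure; the class, field names, and figures are hypothetical and are not taken from the dissertation.

```python
from dataclasses import dataclass

@dataclass
class TurnProcess:
    name: str
    remaining_work: float      # e.g., bags left to load
    process_rate: float        # work units per minute at current staffing
    time_to_departure: float   # minutes until scheduled push-back

def bottleneck_ratio(p: TurnProcess) -> float:
    """Ratio > 1 means the process cannot finish at its current rate before
    departure, i.e., it constrains the turn."""
    achievable = p.process_rate * p.time_to_departure
    return float("inf") if achievable == 0 else p.remaining_work / achievable

def dispatch_priority(processes):
    """Send scarce ramp workers to the most constrained processes first."""
    return sorted(processes, key=bottleneck_ratio, reverse=True)

if __name__ == "__main__":
    turns = [
        TurnProcess("bag loading, flight 101", remaining_work=120, process_rate=4.0, time_to_departure=25),
        TurnProcess("bag loading, flight 202", remaining_work=40, process_rate=4.0, time_to_departure=30),
    ]
    for p in dispatch_priority(turns):
        print(f"{p.name}: bottleneck ratio = {bottleneck_ratio(p):.2f}")
```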

Relevance: 30.00%

Abstract:

Mexico harbors more than 10% of the planet's endemic species. However, the integrity and biodiversity of many ecosystems are experiencing rapid transformation under the influence of a wide array of human and natural disturbances. In order to disentangle the effects of human and natural disturbance regimes at different spatial and temporal scales, we selected six terrestrial (temperate montane forests, montane cloud forests, tropical rain forests, tropical semi-deciduous forests, tropical dry forests, and deserts) and four aquatic (coral reefs, mangrove forests, kelp forests, and saline lakes) ecosystems. We used semiquantitative statistical methods to assess (1) the most important agents of disturbance affecting the ecosystems, (2) the vulnerability of each ecosystem to anthropogenic and natural disturbance, and (3) the differences in ecosystem disturbance regimes and their resilience. Our analysis indicates significant variation in ecological responses, recovery capacity, and resilience among ecosystems. The constant and widespread presence of human impacts on both terrestrial and aquatic ecosystems is reflected either in reduced area coverage for most systems, or in reduced productivity and biodiversity, particularly in the case of fragile ecosystems (e.g., rain forests, coral reefs). In all cases, the interaction between historical human impacts and episodic high-intensity natural disturbance (e.g., hurricanes, fires) has triggered a reduction in species diversity and induced significant changes in habitat distribution or species dominance. The lack of monitoring programs assessing before/after effects of major disturbances in Mexico is one of the major limitations to quantifying the commonalities and differences of disturbance effects on ecosystem properties.

Relevance: 30.00%

Abstract:

In the current age of fast-depleting conventional energy sources, top priority is given to exploring non-conventional energy sources, designing highly efficient energy storage systems, and converting existing machines/instruments/devices into energy-efficient ones. ‘Energy efficiency’ is one of the important challenges for today's scientific and research community worldwide. In line with this demand, the current research focused on developing two highly energy-efficient devices, field emitters and Li-ion batteries, using the beneficial properties of carbon nanotubes (CNTs). Interface-engineered, directly grown CNTs were used as the cathode in field emitters, while a similar structure was applied as the anode in Li-ion batteries. Interface engineering was found to offer minimal resistance to electron flow and strong bonding with the substrate. Both the field emitters and the Li-ion battery anodes benefited from these advantages, demonstrating high energy efficiency. The field emitters developed during this research were characterized by a low turn-on field, high emission current, a very high field enhancement factor, and extremely good stability during long-run operation. Further, applying a 3-dimensional design to these field emitters resulted in one of the highest emission current densities reported so far. The 3-D field emitters registered a 27-fold increase in current density compared to their 2-D counterparts. These achievements were followed by adding new functionalities, transparency and flexibility, to the field emitters, keeping in view the current demand for flexible displays. A CNT-graphene hybrid structure showed appreciable emission, along with very good transparency and flexibility. Li-ion battery anodes prepared using the interface-engineered CNTs offered a 140% increase in capacity compared to conventional graphite anodes. Further, they showed very good rate capability and exceptional ‘zero capacity degradation’ during long cycle operation. The enhanced safety and charge-transfer mechanism of this novel anode structure could be explained through structural characterization. In an attempt to progress further, the CNTs were coated with ultrathin alumina by the atomic layer deposition technique. These alumina-coated CNT anodes offered much higher capacity and exceptional rate capability, with very low capacity degradation at higher current densities. These highly energy-efficient CNT-based anodes are expected to enhance the capacities of future Li-ion batteries.
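
The field enhancement factor referred to above is conventionally extracted from the slope of a Fowler-Nordheim plot (ln(J/E²) versus 1/E). Below is a minimal sketch of that standard extraction, assuming a CNT work function of about 5 eV; the synthetic data are for illustration only and are not the dissertation's measurements.

```python
import numpy as np

B_FN = 6.83e9   # V eV^(-3/2) m^-1, Fowler-Nordheim constant

def field_enhancement_factor(E, J, work_function_eV=5.0):
    """Estimate beta from the slope of a Fowler-Nordheim plot.

    E : macroscopic applied field (V/m); J : emission current density (A/m^2).
    The slope of ln(J/E^2) versus 1/E equals -B_FN * phi^(3/2) / beta."""
    x = 1.0 / np.asarray(E)
    y = np.log(np.asarray(J) / np.asarray(E) ** 2)
    slope, _ = np.polyfit(x, y, 1)
    return -B_FN * work_function_eV ** 1.5 / slope

if __name__ == "__main__":
    # Synthetic data generated with beta = 3000 to verify the extraction.
    beta_true, phi = 3000.0, 5.0
    E = np.linspace(2e6, 6e6, 20)                                 # V/m
    J = E ** 2 * np.exp(-B_FN * phi ** 1.5 / (beta_true * E))
    print(f"recovered beta ~ {field_enhancement_factor(E, J, phi):.0f}")
```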

Relevance: 30.00%

Abstract:

This dissertation presents and evaluates a methodology for scheduling medical application workloads in virtualized computing environments. Such environments are being widely adopted by providers of "cloud computing" services. In the context of provisioning resources for medical applications, such environments allow users to deploy applications on distributed computing resources while keeping their data secure. Furthermore, higher-level services that further abstract the infrastructure-related issues can be built on top of such infrastructures. For example, a medical imaging service can allow medical professionals to process their data in the cloud, freeing them from the burden of having to deploy and manage these resources themselves. In this work, we focus on issues related to scheduling scientific workloads on virtualized environments. We build upon the knowledge base of traditional parallel job scheduling to address the specific case of medical applications while harnessing the benefits afforded by virtualization technology. To this end, we provide the following contributions: (1) an in-depth analysis of the execution characteristics of the target applications when run in virtualized environments; (2) a performance prediction methodology applicable to the target environment; and (3) a scheduling algorithm that harnesses application knowledge and virtualization-related benefits to provide strong scheduling performance and quality-of-service guarantees. In the process of addressing these pertinent issues for our target user base (i.e., medical professionals and researchers), we provide insight that benefits a large community of scientific application users in industry and academia. Our execution time prediction and scheduling methodologies are implemented and evaluated on a real system running popular scientific applications. We find that we are able to predict the execution time of a number of these applications with an average error of 15%. Our scheduling methodology, which is tested with medical image processing workloads, is compared to two baseline scheduling solutions, and we find that it outperforms them in terms of both the number of jobs processed and resource utilization by 20–30%, without violating any deadlines. We conclude that our solution is a viable approach to supporting the computational needs of medical users, even if the cloud computing paradigm is not widely adopted in its current form.
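
A minimal sketch of the kind of prediction-aware, deadline-driven dispatch implied by contribution (3), assuming predicted runtimes carry roughly the 15% average error reported above; the job fields, earliest-deadline-first policy, and safety margin are illustrative assumptions, not the dissertation's actual algorithm.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Job:
    deadline: float                                   # seconds from submission
    name: str = field(compare=False)
    predicted_runtime: float = field(compare=False)   # seconds, from the prediction model

def schedule(jobs, n_vms, error_margin=0.15):
    """Greedy earliest-deadline-first assignment onto n_vms virtual machines.

    Predicted runtimes are inflated by the prediction error margin so that
    deadlines still hold under typical mis-prediction."""
    vm_free_at = [0.0] * n_vms                        # time each VM becomes available
    plan, rejected = [], []
    for job in sorted(jobs):                          # earliest deadline first
        runtime = job.predicted_runtime * (1 + error_margin)
        i = min(range(n_vms), key=vm_free_at.__getitem__)
        finish = vm_free_at[i] + runtime
        if finish <= job.deadline:
            vm_free_at[i] = finish
            plan.append((job.name, i, finish))
        else:
            rejected.append(job.name)                 # would miss its deadline; defer or scale out
    return plan, rejected

if __name__ == "__main__":
    jobs = [Job(600, "segmentation-A", 220), Job(300, "registration-B", 180), Job(900, "render-C", 500)]
    print(schedule(jobs, n_vms=2))
```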

Relevance: 30.00%

Abstract:

This special issue on ‘Science for the management of subtropical embayments: examples from Shark Bay and Florida Bay’ is a valuable compilation of individual research outcomes from Florida Bay and Shark Bay over the past decade, and it addresses gaps in our scientific knowledge base in Shark Bay especially. Yet the compilation also demonstrates excellent research that is poorly integrated and driven by interests and issues that do not necessarily lead to a more integrated stewardship of the marine natural values of either Shark Bay or Florida Bay. Here we describe the status of our current knowledge, introduce the valuable extension of that knowledge through the papers in this issue, and then suggest some future directions. For management, there is a need for a multidisciplinary international science program that focusses research on the ecological resilience of Shark Bay and Florida Bay and on the effect of interactions between physical environmental drivers and biological control through behavioural and trophic interactions, all under increased anthropogenic stressors. Shark Bay offers a ‘pristine template’ for this scale of study.

Relevance: 30.00%

Abstract:

Infrastructure systems are drivers of the national economy. A dollar spent on infrastructure development yields roughly double the initial spending in ultimate economic output in the short term, and over a twenty-year period generalized ‘public investment’ produces an aggregate $3.21 of economic activity per $1.00 spent [1]. Thus, the formulation of policies pertaining to infrastructure investment and development is of significance, affecting the social and economic wellbeing of the nation. The aim of this policy brief is to evaluate innovative financing in infrastructure systems from two different perspectives: (1) through consideration of the current condition of infrastructure in the U.S., the current trends in public spending, and the emerging innovative financing tools; and (2) through evaluation of the roles and interactions of different agencies in the creation and diffusion of innovative financing tools. Then, using the example of transportation financing, the policy brief provides an assessment of policy landscapes that could lead to the closure of the infrastructure financing gap in the U.S. and proposes strategies for citizen involvement to gain public support for innovative financing.

Relevance: 30.00%

Abstract:

Recent studies suggest that coastal ecosystems can bury significantly more C than tropical forests, indicating that continued coastal development and exposure to sea level rise and storms will have global biogeochemical consequences. The Florida Coastal Everglades Long Term Ecological Research (FCE LTER) site provides an excellent subtropical system for examining carbon (C) balance because of its exposure to historical changes in freshwater distribution and sea level rise and its history of significant long-term carbon-cycling studies. FCE LTER scientists used net ecosystem C balance and net ecosystem exchange data to estimate C budgets for riverine mangrove, freshwater marsh, and seagrass meadows, providing insights into the magnitude of C accumulation and lateral aquatic C transport. Rates of net C production in the riverine mangrove forest exceeded those reported for many tropical systems, including terrestrial forests, but there are considerable uncertainties around those estimates due to the high potential for gain and loss of C through aquatic fluxes. C production was approximately balanced between gain and loss in Everglades marshes; however, the contribution of periphyton increases uncertainty in these estimates. Moreover, while the approaches used for these initial estimates were informative, a resolved approach for addressing areas of uncertainty is critically needed for coastal wetland ecosystems. Once resolved, these C balance estimates, in conjunction with an understanding of drivers and key ecosystem feedbacks, can inform cross-system studies of ecosystem response to long-term changes in climate, hydrologic management, and other land use along coastlines.
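
A simplified form of the balance implied above, assuming the micrometeorological sign convention in which negative net ecosystem exchange (NEE) denotes net uptake; this is a generic textbook formulation, not the FCE LTER budget itself:

```latex
% Net ecosystem carbon balance (simplified illustration):
\[
  \mathrm{NECB} \;\approx\; -\,\mathrm{NEE} \;-\; F_{\mathrm{lateral}}
\]
% NEE: vertical CO2 exchange (negative = net uptake, eddy-covariance convention);
% F_lateral: net lateral aquatic export of dissolved and particulate carbon;
% NECB > 0 indicates the ecosystem is accumulating carbon.
```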

Relevance: 30.00%

Abstract:

Given the role ethnic identity has as a protective factor against the effects of marginalization and discrimination (Umaña-Taylor, 2011), longitudinal research examining ethnic identity has become increasingly important. However, successful identity development must incorporate elements from both one's ethnic group and from the United States (Berry, 1980). Despite this, relatively few studies have jointly evaluated ethnic and American identity (Schwartz et al., 2012). The current dissertation, guided by three objectives, sought to address this and several other gaps in the literature. First, psychometric properties of the Multigroup Ethnic Identity Measure (MEIM) and the American Identity Measure (AIM) were evaluated. Second, the dissertation examined growth trends in recently immigrated Hispanic adolescents' and their caregivers' ethnic and American identity. Lastly, the relationship between adolescents' and caregivers' ethnic and American identity was evaluated. The study used an archival sample consisting of 301 recently immigrated Hispanic families collected from Miami (N = 151) and Los Angeles (N = 150). Consistent with previous research, results in Study 1 indicated that a two-factor model reliably provided better fit than a one-factor model and established longitudinal invariance for the MEIM and the AIM. Results from Study 2 found significant growth in adolescents' American identity. While some differences were found across site and nationality, evidence suggested recently immigrated Hispanic adolescents were becoming more bicultural. Counterintuitively, results found a significant decline in caregivers' ethnic identity, which future studies should further examine. Finally, results from Study 3 found several significant positive relationships between adolescents' and their caregivers' ethnic and American identity. Findings provided preliminary evidence for the importance of examining identity development through a systemic lens. Despite several limitations, these three studies represent a step forward in addressing current gaps in the cultural identity literature. Implications for future investigation are discussed.

Relevance: 30.00%

Abstract:

Personalized recommender systems aim to assist users in retrieving and accessing interesting items by automatically acquiring user preferences from historical data and matching items with those preferences. In the last decade, recommendation services have gained great attention due to the problem of information overload. However, despite recent advances in personalization techniques, several critical issues in modern recommender systems have not been well studied. These issues include: (1) understanding the accessing patterns of users (i.e., how to effectively model users' accessing behaviors); (2) understanding the relations between users and other objects (i.e., how to comprehensively assess the complex correlations between users and entities in recommender systems); and (3) understanding the interest change of users (i.e., how to adaptively capture users' preference drift over time). To meet the needs of users in modern recommender systems, it is imperative to provide solutions to address the aforementioned issues and apply the solutions to real-world applications. The major goal of this dissertation is to provide integrated recommendation approaches to tackle the challenges of the current generation of recommender systems. In particular, three user-oriented aspects of recommendation techniques were studied: understanding accessing patterns, understanding complex relations, and understanding temporal dynamics. To this end, we made three research contributions. First, we presented various personalized user profiling algorithms to capture click behaviors of users at both coarse- and fine-grained granularities; second, we proposed graph-based recommendation models to describe the complex correlations in a recommender system; third, we studied temporal recommendation approaches in order to capture the preference changes of users, by considering both long-term and short-term user profiles. In addition, a versatile recommendation framework was proposed, in which the proposed recommendation techniques were seamlessly integrated. Different evaluation criteria were implemented in this framework for evaluating recommendation techniques in real-world recommendation applications. In summary, the frequent changes of user interests and item repositories lead to a series of user-centric challenges that are not well addressed in the current generation of recommender systems. My work proposed reasonable solutions to these challenges and provided insights on how to address them using a simple yet effective recommendation framework.
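
A minimal sketch of one way long-term and short-term preferences could be blended with an exponential time decay, in the spirit of the temporal approaches described above; the decay form, weights, and data layout are assumptions made for illustration only.

```python
from collections import defaultdict

def temporal_profile(events, now, half_life_days=30.0):
    """Build a decayed interest profile from (timestamp_days, category) click events.

    Recent clicks count almost fully; older clicks decay with the given half-life."""
    profile = defaultdict(float)
    for t, category in events:
        profile[category] += 0.5 ** ((now - t) / half_life_days)
    total = sum(profile.values()) or 1.0
    return {c: w / total for c, w in profile.items()}

def score(item_categories, long_term, short_term, alpha=0.7):
    """Blend short-term (recent) and long-term preferences; alpha weights the short-term profile."""
    return sum(alpha * short_term.get(c, 0.0) + (1 - alpha) * long_term.get(c, 0.0)
               for c in item_categories)

if __name__ == "__main__":
    clicks = [(0, "news"), (5, "news"), (58, "sports"), (59, "sports"), (60, "sports")]
    long_term = temporal_profile(clicks, now=60, half_life_days=365)   # slow decay
    short_term = temporal_profile(clicks, now=60, half_life_days=7)    # fast decay
    print(score({"sports"}, long_term, short_term), score({"news"}, long_term, short_term))
```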

Relevance: 30.00%

Abstract:

The Unified Modeling Language (UML) has quickly become the industry standard for object-oriented software development. It is being widely used in organizations and institutions around the world. However, UML is often found to be too complex for novice systems analysts. Although prior research has identified difficulties novice analysts encounter in learning UML, no viable solution has been proposed to address these difficulties. Sequence-diagram modeling, in particular, has largely been overlooked. The sequence diagram models the behavioral aspects of an object-oriented software system in terms of interactions among its building blocks, i.e., objects and classes. It is one of the most commonly used UML diagrams in practice. However, there has been little research on sequence-diagram modeling. The current literature scarcely provides effective guidelines for developing a sequence diagram. Such guidelines would be greatly beneficial to novice analysts who, unlike experienced systems analysts, do not possess relevant prior experience from which to easily learn how to develop a sequence diagram. There is a need for an effective sequence-diagram modeling technique for novices. This dissertation reports a research study that identified novice difficulties in modeling a sequence diagram and proposed a technique called CHOP (CHunking, Ordering, Patterning), which was designed to reduce cognitive load by addressing the cognitive complexity of sequence-diagram modeling. The CHOP technique was evaluated in a controlled experiment against a technique recommended in a well-known textbook, which was found to be representative of approaches provided in many textbooks as well as in the practitioner literature. The results indicated that novice analysts were able to perform better using the CHOP technique. This outcome seems to have been enabled by the pattern-based heuristics provided by the technique. Meanwhile, novice analysts rated the CHOP technique as more useful, although not significantly easier to use, than the control technique. The study established that the CHOP technique is an effective sequence-diagram modeling technique for novice analysts.

Relevance: 30.00%

Abstract:

Right across Europe, technology is playing a vital part in enhancing learning for an increasingly diverse population of learners. Learning is increasingly flexible, social, and mobile, and it is supported by high-quality multimedia resources. Institutional VLEs are seeing a shift towards open-source products, and these core systems are supplemented by a range of social and collaborative learning tools based on web 2.0 technologies. Learners undertaking field studies and those in the workplace are coming to expect that these off-campus experiences will also be technology-rich, whether supported by institutional or user-owned devices. As well as keeping European businesses competitive, learning is seen as a means of increasing social mobility and supporting an agenda of social justice. For a number of years the EUNIS E-Learning Task Force (ELTF) has conducted snapshot surveys of e-learning across member institutions, collected case studies of good practice in e-learning (see Hayes et al., 2009, in the references), supported a group looking at the future of e-learning, and showcased the best of innovation in its e-learning Award. Now, for the first time, the ELTF membership has come together to undertake an analysis of developments in the member states and to assess what this might mean for the future. The group applied the techniques of World Café conversation and Scenario Thinking to develop its thoughts. The analysis is unashamedly qualitative and draws on expertise from leading universities across eight of the EUNIS member states. What emerges is interesting in terms of the common trends in developments across all of the nations and the similarities in hopes and concerns about the future development of learning.

Relevance: 30.00%

Abstract:

We review our recent work on the anodization of InP in KOH electrolytes. The anodic oxidation processes are shown to be remarkably different in different concentrations of KOH. Anodization in 2–5 mol dm⁻³ KOH electrolytes results in the formation of porous InP layers but, under similar conditions in 1 mol dm⁻³ KOH, no porous structure is evident. Rather, the InP electrode is covered with a thin, compact surface film at lower potentials and, at higher potentials, a highly porous surface film is formed which cracks on drying. Anodization of electrodes in 2–5 mol dm⁻³ KOH results in the formation of porous InP under both potential sweep and constant potential conditions. The porosity is estimated at ~65%. A thin layer (~30 nm) close to the surface appears to be unmodified. It is observed that this dense, near-surface layer is penetrated by a low density of pores which appear to connect it to the electrolyte. Well-defined oscillations are observed when InP is anodized in both KOH and (NH₄)₂S electrolytes. The charge per cycle remains constant at 0.32 C cm⁻² in (NH₄)₂S but increases linearly with potential in KOH. Although the characteristics of the oscillations in the two systems differ, both show reproducible and well-behaved values of charge per cycle.
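
As a rough illustration of what a constant charge per oscillation cycle implies, Faraday's law converts the 0.32 C cm⁻² figure into an equivalent amount of InP consumed per cycle; the number of electrons transferred per InP formula unit depends on the anodic reaction products and is treated here as an assumed parameter.

```python
F = 96485.0      # C per mole of electrons (Faraday constant)
M_INP = 145.8    # g/mol, molar mass of InP
RHO_INP = 4.81   # g/cm^3, density of InP

def inp_consumed_per_cycle(charge_density_c_per_cm2, n_electrons):
    """Equivalent thickness (nm) of InP oxidized per oscillation cycle,
    assuming n_electrons are transferred per InP formula unit."""
    mol_per_cm2 = charge_density_c_per_cm2 / (n_electrons * F)
    thickness_cm = mol_per_cm2 * M_INP / RHO_INP
    return thickness_cm * 1e7    # cm -> nm

if __name__ == "__main__":
    for n in (6, 8):   # assumed values; the true number depends on the anodic reaction
        print(f"n = {n}: ~{inp_consumed_per_cycle(0.32, n):.0f} nm of InP per cycle")
```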

Relevance: 30.00%

Abstract:

X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities, and as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality. All things being held equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.

A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.

Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.

The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness by which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).

First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection, FBP, vs. Advanced Modeled Iterative Reconstruction, ADMIRE). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (an increase in detectability index of up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
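
A minimal sketch of one such mathematical observer, a spatial-domain non-prewhitening matched filter computing a detectability index from ensembles of signal-present and signal-absent ROIs; the phantom-specific implementation used in the dissertation is not reproduced here.

```python
import numpy as np

def npw_detectability(signal_present, signal_absent):
    """Non-prewhitening matched-filter detectability index d'.

    Inputs are arrays of shape (n_images, ny, nx): repeated ROIs with and
    without the low-contrast object."""
    template = signal_present.mean(axis=0) - signal_absent.mean(axis=0)     # expected signal
    t_sp = np.tensordot(signal_present, template, axes=([1, 2], [0, 1]))    # observer test statistics
    t_sa = np.tensordot(signal_absent, template, axes=([1, 2], [0, 1]))
    pooled_var = 0.5 * (t_sp.var(ddof=1) + t_sa.var(ddof=1))
    return (t_sp.mean() - t_sa.mean()) / np.sqrt(pooled_var)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    disk = 2.0 * (np.hypot(*np.mgrid[-16:16, -16:16]) < 5)   # small low-contrast disk
    absent = rng.normal(0, 10, (200, 32, 32))                # noise-only ROIs
    present = absent[:100] + disk                            # ROIs containing the disk
    print(f"NPW d' ~ {npw_detectability(present, absent[100:]):.2f}")
```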

Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (≤6 mm) low-contrast (≤20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.

Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
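
For comparison, a minimal sketch of a channelized Hotelling observer; simple rotationally symmetric Gaussian channels stand in for the Gabor or Laguerre-Gauss channel sets typically used, and the CNR referred to above is simply the mean lesion-to-background difference divided by the background standard deviation.

```python
import numpy as np

def gaussian_channels(size, widths=(2, 4, 8, 16)):
    """Rotationally symmetric Gaussian channels (a simple stand-in for the
    Gabor or Laguerre-Gauss channels typically used in CHO studies)."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
    r2 = x ** 2 + y ** 2
    U = np.stack([np.exp(-r2 / (2 * w ** 2)).ravel() for w in widths], axis=1)
    return U / np.linalg.norm(U, axis=0)

def cho_detectability(signal_present, signal_absent, U):
    """Channelized Hotelling observer d' from ROI ensembles of shape (n, ny, nx)."""
    v_sp = signal_present.reshape(len(signal_present), -1) @ U   # channel outputs
    v_sa = signal_absent.reshape(len(signal_absent), -1) @ U
    dv = v_sp.mean(axis=0) - v_sa.mean(axis=0)                   # mean channel signal
    S = 0.5 * (np.cov(v_sp, rowvar=False) + np.cov(v_sa, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    lesion = 2.0 * np.exp(-np.hypot(*np.mgrid[-16:16, -16:16]) ** 2 / 32.0)
    absent = rng.normal(0, 10, (300, 32, 32))
    present = absent[:150] + lesion
    print(f"CHO d' ~ {cho_detectability(present, absent[150:], U=gaussian_channels(32)):.2f}")
```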

The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
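
A minimal sketch of the repeated-scan subtraction measurement described above: subtracting two statistically independent repeats removes the deterministic phantom structure, and dividing the residual standard deviation by √2 recovers the single-image quantum noise.

```python
import numpy as np

def subtraction_noise(scan_a, scan_b, roi):
    """Quantum noise (HU) in a region of interest from two repeated scans.

    The difference image carries twice the noise variance of a single scan,
    hence the division by sqrt(2)."""
    diff = scan_a[roi] - scan_b[roi]
    return diff.std(ddof=1) / np.sqrt(2)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    phantom = rng.uniform(-50, 50, (256, 256))            # fixed (textured) background
    scan_a = phantom + rng.normal(0, 12, phantom.shape)   # two repeated noisy acquisitions
    scan_b = phantom + rng.normal(0, 12, phantom.shape)
    roi = (slice(64, 192), slice(64, 192))
    print(f"measured noise ~ {subtraction_noise(scan_a, scan_b, roi):.1f} HU (true value 12)")
```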

To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing the image quality of iterative algorithms.
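
A minimal sketch of a conventional noise power spectrum estimate from square ROIs of noise-only (subtraction) images; the dissertation's method for irregularly shaped ROIs extends beyond this and is not reproduced here.

```python
import numpy as np

def nps_2d(noise_rois, pixel_size_mm):
    """2-D noise power spectrum (HU^2 mm^2) from an ensemble of square noise-only ROIs.

    noise_rois : array of shape (n_rois, N, N), e.g. ROIs cut from subtraction images."""
    _, N, _ = noise_rois.shape
    detrended = noise_rois - noise_rois.mean(axis=(1, 2), keepdims=True)
    dft2 = np.abs(np.fft.fftshift(np.fft.fft2(detrended), axes=(1, 2))) ** 2
    return (pixel_size_mm ** 2 / (N * N)) * dft2.mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    rois = rng.normal(0, 12, (64, 64, 64))    # white-noise stand-in for measured noise ROIs
    nps = nps_2d(rois, pixel_size_mm=0.5)
    # Integrating the NPS over spatial frequency recovers the pixel variance (~144 HU^2):
    print(nps.sum() * (1.0 / (64 * 0.5)) ** 2)
```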

To move beyond just assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that, at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in the uniform phantom than in the textured phantoms.

The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
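
A minimal sketch of the idea behind such analytic lesion models: a radially symmetric profile parameterized by size, contrast, and edge sharpness is voxelized and added to a patient ROI to form a "hybrid" image. The functional form and parameters below are illustrative, not the dissertation's fitted models.

```python
import numpy as np

def lesion_profile(shape, center, radius_px, contrast_hu, edge_softness_px=1.5):
    """Radially symmetric lesion with a smooth (sigmoid) edge profile."""
    coords = np.indices(shape)
    r = np.sqrt(sum((c - c0) ** 2 for c, c0 in zip(coords, center)))
    return contrast_hu / (1.0 + np.exp((r - radius_px) / edge_softness_px))

def make_hybrid(patient_roi, lesion):
    """Add the modeled lesion to a real patient ROI (image-domain insertion)."""
    return patient_roi + lesion

if __name__ == "__main__":
    roi = np.zeros((64, 64))                  # stand-in for a real liver ROI
    lesion = lesion_profile(roi.shape, center=(32, 32), radius_px=6, contrast_hu=-15)
    hybrid = make_hybrid(roi, lesion)
    print(f"lesion core contrast ~ {hybrid.min():.1f} HU")
```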

Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.

The second study demonstrating the utility of the lesion modeling framework focused on assessing the detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.

In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.

Relevance: 30.00%

Abstract:

Advancements in retinal imaging technologies have drastically improved the quality of eye care in the past couple of decades. Scanning laser ophthalmoscopy (SLO) and optical coherence tomography (OCT) are two examples of critical imaging modalities for the diagnosis of retinal pathologies. However, current-generation SLO and OCT systems have limitations in diagnostic capability due to the following factors: the use of bulky tabletop systems, monochromatic imaging, and resolution degradation due to ocular aberrations and diffraction.

Bulky tabletop SLO and OCT systems are incapable of imaging patients who are supine, under anesthesia, or otherwise unable to maintain the required posture and fixation. Monochromatic SLO and OCT imaging prevents the identification of various color-specific diagnostic markers visible with color fundus photography, such as those of neovascular age-related macular degeneration. Resolution degradation due to ocular aberrations and diffraction has prevented the imaging of photoreceptors close to the fovea without the use of adaptive optics (AO), which require bulky and expensive components that limit the potential for widespread clinical use.

In this dissertation, techniques for extending the diagnostic capability of SLO and OCT systems are developed. These techniques include design strategies for miniaturizing and combining SLO and OCT to permit multi-modal, lightweight handheld probes to extend high quality retinal imaging to pediatric eye care. In addition, a method for extending true color retinal imaging to SLO to enable high-contrast, depth-resolved, high-fidelity color fundus imaging is demonstrated using a supercontinuum light source. Finally, the development and combination of SLO with a super-resolution confocal microscopy technique known as optical photon reassignment (OPRA) is demonstrated to enable high-resolution imaging of retinal photoreceptors without the use of adaptive optics.

Relevance: 30.00%

Abstract:

Over 50% of the world's population live within 3 km of rivers and lakes, highlighting the ongoing importance of freshwater resources to human health and societal well-being. Yet although the world's standing waters (natural lakes and reservoirs) cover c. 3.5% of the Earth's non-glaciated land mass, trends in their environmental quality are poorly understood, at least in comparison with rivers, and so evaluation of their current condition and sensitivity to change are global priorities. Here it is argued that a geospatial approach harnessing existing global datasets, along with new-generation remote sensing products, offers the basis to characterise trajectories of change in lake properties, e.g., water quality, physical structure, hydrological regime, and ecological behaviour. This approach furthermore provides the evidence base to understand the relative importance of climatic forcing and/or changing catchment processes, e.g., land cover and soil moisture data, which, coupled with climate data, provide the basis to model regional water balance and runoff estimates over time. Using examples derived primarily from the Danube Basin but also from other parts of the world, we demonstrate the power of the approach and its utility for assessing the sensitivity of lake systems to environmental change, and hence for better managing these key resources in the future.
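
The regional water balance alluded to here is conventionally closed with a simple mass-balance relation; a generic form is shown below, not the specific model used for the Danube Basin analysis:

```latex
% Catchment water balance over a time step (all terms as depths, e.g. mm):
\[
  Q \;=\; P \;-\; \mathrm{ET} \;-\; \Delta S
\]
% Q: runoff, P: precipitation, ET: evapotranspiration (constrained by land
% cover and soil moisture), \Delta S: change in catchment storage.
```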