998 results for Java Advanced Imaging


Relevance: 30.00%

Abstract:

After decades of development in programming languages and programming environments, Smalltalk remains one of the few environments that provide advanced features, and it is still widely used in industry. However, as Java became prevalent, the ability to call Java code from Smalltalk and vice versa has become important. Traditional approaches integrate the Java and Smalltalk languages through low-level communication between separate Java and Smalltalk virtual machines. We are not aware of any previous attempt to execute and integrate the Java language directly in the Smalltalk environment. A direct integration allows for very tight and almost seamless integration of the languages and their objects within a single environment. Yet such integration and language interoperability raise challenging issues related to method naming conventions, method overloading, exception handling, and thread-locking mechanisms. In this paper we describe ways to overcome these challenges and to integrate Java into the Smalltalk environment. Using the techniques described in this paper, the programmer can call Java code from Smalltalk using standard Smalltalk idioms while the semantics of each language remain preserved. We present STX:LIBJAVA, an implementation of the Java virtual machine within Smalltalk/X, as a validation of our approach.
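
To see why method overloading is one of these challenges, consider a minimal Java sketch (illustrative only; the selector scheme mentioned in the comments is hypothetical, not STX:LIBJAVA's actual convention). Java resolves overloaded calls by static argument types, whereas a single Smalltalk keyword selector names exactly one method:

    // Illustrative only: overloaded Java methods that a Smalltalk bridge
    // must disambiguate, since one Smalltalk selector maps to one method.
    public class Printer {
        void print(int value)    { System.out.println("int: " + value); }
        void print(String value) { System.out.println("String: " + value); }

        public static void main(String[] args) {
            Printer p = new Printer();
            p.print(42);   // Java picks print(int) from the argument type
            p.print("42"); // Java picks print(String)
            // A Smalltalk call such as "printer print: 42" must choose one
            // overload at run time, e.g. via hypothetical type-qualified
            // selectors like print_int: or print_String:.
        }
    }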

Relevance: 30.00%

Abstract:

OBJECTIVES: An optimized, longitudinal in vivo magnetic resonance vessel-wall imaging protocol was evaluated regarding its capability of detecting differences in time-dependent atherosclerotic lesion progression in the aortic arch between ApoE(-/-) and double-deficient ApoE(-/-)/TNF(-/-) mice at comparatively early plaque development stages. MATERIALS AND METHODS: Seven ApoE(-/-) and seven ApoE(-/-)/TNF(-/-) female mice underwent MRI at 11.75 T at four stages up to 26 weeks of age. A double-gated spin-echo MRI sequence was used, with careful perpendicular slice positioning, to visualize the vessel wall of the ascending aortic arch. RESULTS: Wall-thickness progression measured with MRI was significant at 11 weeks of age in ApoE(-/-) mice, but only at 26 weeks in ApoE(-/-)/TNF(-/-) mice. A significant correlation was found between MRI wall thickness and lesion area determined by histology. CONCLUSION: MRI was shown to be sensitive enough to reveal subtle genetically induced differences in lesion progression at ages earlier than 25 weeks.

Relevance: 30.00%

Abstract:

The study of advanced materials aimed at improving human life has been performed since time immemorial. Such studies have created everlasting and greatly revered monuments and have helped revolutionize transportation by ushering in the age of lighter-than-air flying machines. A study of the mechanical behavior of advanced materials can thus pave the way for their use for mankind's benefit. In this spirit, this dissertation broadly performs two investigations. First, an efficient modeling approach is established to predict the elastic response of cellular materials with distributions of cell geometries. Cellular materials find important applications in structural engineering. The approach does not require the complex and time-consuming computational techniques usually associated with modeling such materials. Unlike most current analytical techniques, the modeling approach directly accounts for the cellular material microstructure. The approach combines micropolar elasticity theory and elastic mixture theory to predict the elastic response of cellular materials, and it is applied to two-dimensional balsa wood. Predicted properties are in good agreement with experimentally determined properties, which emphasizes the model's potential to predict the elastic response of other cellular solids, such as open-cell and closed-cell foams. The second topic concerns intraneural ganglion cysts, a set of medical conditions that result in denervation of the muscles innervated by the cystic nerve, leading to pain and loss of function. Current treatment approaches only temporarily alleviate pain and denervation and do not prevent cyst recurrence. Hence, a mechanistic understanding of the pathogenesis of intraneural ganglion cysts can help clinicians understand them better and therefore devise more effective treatment options. In this study, an analysis methodology using finite element analysis is established to investigate the pathogenesis of intraneural ganglion cysts. Using this methodology, the propagation of these cysts is analyzed in their most common site of occurrence in the human body, i.e., the common peroneal nerve. Results obtained using finite element analysis show good correlation with clinical imaging patterns, thereby validating the promise of the method to study cyst pathogenesis.
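
For orientation on the mixture-theory ingredient (the classical rule of mixtures is shown here only as a reference point; the dissertation's micropolar formulation additionally carries couple-stress moduli and is not reproduced in this abstract), an effective elastic modulus can be estimated as the volume-fraction-weighted average of the constituent moduli:

    % Classical (Voigt) rule of mixtures: effective modulus E_eff as the
    % volume-fraction-weighted average of constituent moduli E_i.
    E_{\mathrm{eff}} = \sum_{i} v_i \, E_i , \qquad \sum_{i} v_i = 1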

Relevance: 30.00%

Abstract:

The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) has been used to quantify SO2 emissions from passively degassing volcanoes. This dissertation explores ASTER's capability to detect SO2 through satellite validation, enhancement techniques, and extensive processing of images from a variety of volcanoes. ASTER is compared to the Mini UV Spectrometer (MUSe), a ground-based instrument, to determine whether reasonable SO2 fluxes can be quantified from a plume emitted by Lascar, Chile. The two sensors were in good agreement, with ASTER proving to be a reliable detector of SO2. ASTER illustrated the advantages of imaging a plume in 2D, with better temporal resolution than the MUSe. SO2 plumes in ASTER imagery are not always discernible in the raw TIR data. Principal Component Analysis (PCA) and Decorrelation Stretch (DCS) enhancement techniques were compared to determine how well they highlight a variety of volcanic plumes. DCS produced a consistent output, and the composition of plumes from explosive eruptions was easy to identify. As the plumes became smaller and lower in altitude, they became harder to distinguish using DCS; PCA proved better at identifying smaller, low-altitude plumes. ASTER was used to investigate SO2 emissions at Lascar, Chile, where activity has been characterized by cyclic behavior and persistent degassing (Matthews et al. 1997). Previous studies at Lascar have primarily focused on changes in thermal infrared anomalies, neglecting gas emissions. Using the SO2 data along with changes in thermal anomalies and visual observations, it is evident that Lascar is at the end of an eruptive cycle that began in 1993. Declining gas emissions and crater temperatures suggest that the conduit is sealing. ASTER and the Ozone Monitoring Instrument (OMI) were used to determine the annual contribution of SO2 to the troposphere from the Central and South American volcanic arcs between 2000 and 2011; fluxes of 3.4 Tg/a for Central America and 3.7 Tg/a for South America were calculated. The detection limits of ASTER were also explored, with interesting results: plumes from many of the high-emitting volcanoes, such as Villarrica, Chile, were not detected by ASTER.
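
As background on the PCA enhancement step (a generic formulation; the dissertation's specific band selection is not given in this abstract), the principal components are projections of the mean-centred multiband radiance vector onto the eigenvectors of the band covariance matrix, ordered by decreasing variance:

    % PCA on an n-band pixel vector x: eigendecomposition of the band
    % covariance matrix C gives components y_k ordered by variance.
    C \mathbf{w}_k = \lambda_k \mathbf{w}_k , \qquad
    y_k = \mathbf{w}_k^{\top} ( \mathbf{x} - \bar{\mathbf{x}} ) , \qquad
    \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n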

Relevance: 30.00%

Abstract:

Direct imaging of extra-solar planets in the visible and infrared region has generated great interest among scientists and the general public alike. However, this is a challenging problem. Difficulties in detecting a planet (a faint source) are caused mostly by two factors: sidelobes caused by starlight diffraction from the edge of the pupil, and randomly scattered starlight caused by phase errors from imperfections in the optical system. While the latter difficulty can be corrected by high-density active deformable mirrors with advanced phase sensing and control technology, the optimal strategy for suppressing the diffraction sidelobes is still an open question. In this thesis, I present a new approach to the sidelobe reduction problem: pupil phase apodization. It is based on the discovery that an anti-symmetric spatial phase modulation pattern imposed over a pupil or a relay plane causes diffracted starlight suppression sufficient for imaging of extra-solar planets. Numerical simulations with specific square pupil (side D) phase functions, such as ..., demonstrate annulling in at least one quadrant of the diffraction plane to a contrast level better than 10^12, with an inner working angle down to 3.5 λ/D (with a = 3 and e = 10^3). Furthermore, our computer experiments show that phase apodization remains effective throughout a broad spectrum (60% of the central wavelength) covering the entire visible light range. In addition to the specific phase functions that can yield deep sidelobe reduction in one quadrant, we also found that a modified Gerchberg-Saxton algorithm can help to find small (101 × 101 element) discrete phase functions when regional sidelobe reduction is desired. Our simulations show that a 101 × 101 segmented but gapless active mirror can also generate a dark region with an inner working distance of about 2.8 λ/D in one quadrant. Phase-only modulation has the additional appeal of potential implementation via active segmented or deformable mirrors, thereby combining compensation of random phase aberrations and diffraction halo removal in a single optical element.
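
The mechanism can be stated compactly under the standard Fraunhofer diffraction model (a generic statement; the specific phase functions are elided in the abstract above): the focal-plane intensity is the squared magnitude of the Fourier transform of the complex pupil function, and the apodizing phase is chosen anti-symmetric:

    % Fraunhofer point-spread function of a pupil with amplitude A(x)
    % and imposed phase \phi(x); anti-symmetry of \phi redistributes
    % the diffracted starlight away from a chosen quadrant.
    \mathrm{PSF}(\mathbf{u}) =
      \left| \int A(\mathbf{x}) \, e^{\, i \phi(\mathbf{x})} \,
      e^{-2\pi i \, \mathbf{u} \cdot \mathbf{x}} \, d\mathbf{x} \right|^{2} ,
    \qquad \phi(-\mathbf{x}) = -\phi(\mathbf{x})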

Relevance: 30.00%

Abstract:

Postmortem imaging is increasingly used in forensic practice in cases of natural death related to cardiovascular diseases, which represent the most common causes of death in developed countries. While radiological examination is generally considered a good complement to conventional autopsy, it was long thought to have limited application in cardiovascular pathology. At present, multidetector computed tomography (MDCT), CT angiography, and cardiac magnetic resonance imaging (MRI) are used in the postmortem radiological investigation of cardiovascular pathologies. This review presents the current state of postmortem imaging for cardiovascular pathologies in cases of sudden cardiac death (SCD), taking into consideration both its advantages and limitations. The radiological evaluation of ischemic heart disease (IHD), the most frequent cause of SCD in the general population of industrialized countries, includes the examination of the coronary arteries and the myocardium. Postmortem CT angiography (PMCTA) is very useful for the detection of stenoses and occlusions of coronary arteries, but less so for the identification of ischemic myocardium. MRI is the method of choice for the radiological investigation of the myocardium in clinical practice, but its accessibility and application are still limited in postmortem practice. There are very few reports applying postmortem radiology to the investigation of other causes of SCD, such as cardiomyopathies, coronary artery abnormalities, and valvular pathologies. Cardiomyopathies, the most frequent cause of SCD in young athletes, cannot be diagnosed postmortem by echocardiography, the most widely available technique in clinical practice for the functional evaluation of the heart and the detection of cardiomyopathies. PMCTA and MRI have the potential to detect advanced stages of disease when a morphological substrate is present, but these methods have yet to be sufficiently validated for postmortem cases. Genetically determined channelopathies cannot be detected radiologically. This review underlines the need to establish the role of postmortem radiology in the diagnosis of SCD.

Relevance: 30.00%

Abstract:

BACKGROUND: Given the fragmentation of outpatient care, timely follow-up of abnormal diagnostic imaging results remains a challenge. We hypothesized that an electronic medical record (EMR) that facilitates the transmission and availability of critical imaging results through either automated notification (alerting) or direct access to the primary report would eliminate this problem. METHODS: We studied critical imaging alert notifications in the outpatient setting of a tertiary care Department of Veterans Affairs facility from November 2007 to June 2008. Tracking software determined whether the alert was acknowledged (ie, health care practitioner/provider [HCP] opened the message for viewing) within 2 weeks of transmission; acknowledged alerts were considered read. We reviewed medical records and contacted HCPs to determine timely follow-up actions (eg, ordering a follow-up test or consultation) within 4 weeks of transmission. Multivariable logistic regression models accounting for clustering effect by HCPs analyzed predictors for 2 outcomes: lack of acknowledgment and lack of timely follow-up. RESULTS: Of 123 638 studies (including radiographs, computed tomographic scans, ultrasonograms, magnetic resonance images, and mammograms), 1196 images (0.97%) generated alerts; 217 (18.1%) of these were unacknowledged. Alerts had a higher risk of being unacknowledged when the ordering HCPs were trainees (odds ratio [OR], 5.58; 95% confidence interval [CI], 2.86-10.89) and when dual-alert (>1 HCP alerted) as opposed to single-alert communication was used (OR, 2.02; 95% CI, 1.22-3.36). Timely follow-up was lacking in 92 (7.7% of all alerts) and was similar for acknowledged and unacknowledged alerts (7.3% vs 9.7%; P = .22). Risk for lack of timely follow-up was higher with dual-alert communication (OR, 1.99; 95% CI, 1.06-3.48) but lower when additional verbal communication was used by the radiologist (OR, 0.12; 95% CI, 0.04-0.38). Nearly all abnormal results lacking timely follow-up at 4 weeks were eventually found to have measurable clinical impact in terms of further diagnostic testing or treatment. CONCLUSIONS: Critical imaging results may not receive timely follow-up actions even when HCPs receive and read results in an advanced, integrated electronic medical record system. A multidisciplinary approach is needed to improve patient safety in this area.
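
To help interpret the reported effect sizes (a standard identity, not specific to this study): in a multivariable logistic regression, the odds ratio for a predictor equals the exponential of its fitted coefficient, so an OR of 5.58 for trainee-ordered studies corresponds to 5.58-fold higher odds of an unacknowledged alert, holding the other covariates fixed:

    % Multivariable logistic regression: log-odds linear in predictors;
    % the odds ratio for predictor x_j is the exponential of \beta_j.
    \log \frac{p}{1 - p} = \beta_0 + \sum_{j} \beta_j x_j ,
    \qquad \mathrm{OR}_j = e^{\beta_j}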

Relevance: 30.00%

Abstract:

Magnetic resonance imaging, with its exquisite soft tissue contrast, is an ideal modality for investigating spinal cord pathology. While conventional MRI techniques are very sensitive to spinal cord pathology, their specificity is somewhat limited. Diffusion MRI is an advanced technique that is a very sensitive and specific indicator of the integrity of white matter tracts. Diffusion imaging has been shown to detect early ischemic changes in white matter while conventional imaging demonstrates no change. By acquiring the complete apparent diffusion tensor (ADT), tissue diffusion properties can be expressed in terms of quantitative and rotationally invariant parameters. Systematic study of spinal cord injury (SCI) in vivo requires controlled animal models such as the popular rat model. To date, studies of the spinal cord using ADT imaging have been performed exclusively in fixed, excised spinal cords, introducing inevitable artifacts and losing the benefits of MRI's noninvasive nature. In vivo imaging reflects the actual in vivo tissue properties and allows each animal to be imaged at multiple time points, greatly reducing the number of animals required to achieve statistical significance. Because the spinal cord is very small, the available signal-to-noise ratio (SNR) is very low. Prior spin-echo-based ADT studies of rat spinal cord have relied on high magnetic field strengths and long imaging times (on the order of 10 hours) for adequate SNR. Such long imaging times are incompatible with in vivo imaging and are not relevant for imaging the early phases following SCI. Echo planar imaging (EPI) is one of the fastest imaging methods and is popular for diffusion imaging. However, EPI further lowers the image SNR and is very sensitive to small imperfections in the magnetic field, such as those introduced by the bony spine. Additionally, the small field-of-view (FOV) needed for spinal cord imaging requires large imaging gradients, which generate EPI artifacts; the addition of diffusion gradients introduces yet further artifacts. This work develops a method for rapid EPI-based in vivo diffusion imaging of the rat spinal cord. The method involves improving the SNR using an implantable coil, reducing magnetic field inhomogeneities by means of an autoshim, and correcting EPI artifacts by post-processing. New EPI artifacts due to diffusion gradients are described, and post-processing correction techniques are developed. These techniques were used to obtain rotationally invariant diffusion parameters from 9 animals in vivo and were validated against the gold-standard, but slow, spin-echo-based diffusion sequence. These are the first reported measurements of the ADT in spinal cord in vivo. Many of the techniques described are equally applicable to imaging of the human spinal cord. We anticipate that these techniques will aid in evaluating and optimizing potential therapies, and will lead to improved patient care.
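
Two standard rotationally invariant parameters derived from the eigenvalues λ1, λ2, λ3 of the diffusion tensor (generic definitions, consistent with but not quoted from this work) are the mean diffusivity and the fractional anisotropy:

    % Mean diffusivity (MD) and fractional anisotropy (FA) from the
    % diffusion tensor eigenvalues \lambda_1, \lambda_2, \lambda_3:
    \mathrm{MD} = \bar{\lambda} = \tfrac{1}{3} ( \lambda_1 + \lambda_2 + \lambda_3 ) ,
    \qquad
    \mathrm{FA} = \sqrt{\tfrac{3}{2}}
      \frac{ \sqrt{ (\lambda_1 - \bar{\lambda})^2 + (\lambda_2 - \bar{\lambda})^2
                    + (\lambda_3 - \bar{\lambda})^2 } }
           { \sqrt{ \lambda_1^2 + \lambda_2^2 + \lambda_3^2 } }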

Relevance: 30.00%

Abstract:

Derivation of probability estimates complementary to geophysical data sets has gained special attention in recent years. Information about the confidence level of provided physical quantities is required to construct an error budget for higher-level products and to correctly interpret the final results of a particular analysis. For the generation of products based on satellite data, a common input is a cloud mask, which allows discrimination between surface and cloud signals; the surface information is further divided into snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited for the 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: between cloudy and clear-sky, between cloudy and snow, and between clear-sky and snow conditions. Unlike the majority of available techniques, which are usually based on a decision-tree approach, the PCM algorithm uses all spectral, angular, and ancillary information in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the problem of deriving a single threshold value for each spectral test is overcome by the concept of a multidimensional information space, which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds were enhanced by means of the invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to the northern parts of Africa, which pose diverse difficulties for cloud/snow masking algorithms. The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observation) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcome of these analyses demonstrates the good detection skill of the PCM method, with results comparable to or better than the reference PPS algorithm.
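
A minimal form of such a LUT retrieval (an illustration of the binned-probability idea, not the paper's exact estimator): for a pixel whose feature vector falls into bin b of the discretized information space, the cloudy/clear-sky probability is estimated from the training counts accumulated in that bin:

    % Binned look-up-table retrieval: probability that a pixel with
    % feature vector x in bin b is cloudy, from per-bin training counts N.
    P(\text{cloudy} \mid \mathbf{x} \in b) =
      \frac{ N_{\text{cloudy}}(b) }{ N_{\text{cloudy}}(b) + N_{\text{clear}}(b) }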

Relevance: 30.00%

Abstract:

Lake water temperature (LWT) is an important driver of lake ecosystems and has been identified as an indicator of climate change. Consequently, the Global Climate Observing System (GCOS) lists LWT as an essential climate variable. Although long in situ time series of LWT exist for some European lakes, many lakes are not observed at all, or only on a non-regular basis, making these observations insufficient for climate monitoring. Satellite data can provide the information needed. However, only few satellite sensors offer the possibility to analyse time series covering 25 years or more. The Advanced Very High Resolution Radiometer (AVHRR) is among these, having been flown as a heritage instrument for almost 35 years, and it will continue to fly for at least ten more years, offering a unique opportunity for satellite-based climate studies. Herein we present a satellite-based lake surface water temperature (LSWT) data set for European water bodies in or near the Alps based on the extensive AVHRR 1 km data record (1989–2013) of the Remote Sensing Research Group at the University of Bern. It has been compiled from AVHRR/2 (NOAA-07, -09, -11, -14) and AVHRR/3 (NOAA-16, -17, -18, -19 and MetOp-A) data. The high accuracy needed for climate-related studies requires careful pre-processing and consideration of the atmospheric state. The LSWT retrieval is based on a simulation-based scheme making use of the Radiative Transfer for TOVS (RTTOV) Version 10 together with ERA-Interim reanalysis data from the European Centre for Medium-range Weather Forecasts. The resulting LSWTs were extensively compared with in situ measurements from lakes of various sizes between 14 and 580 km², and the resulting biases and RMSEs were found to be within the ranges of −0.5 to 0.6 K and 1.0 to 1.6 K, respectively. The upper limits of the reported errors can be attributed to uncertainties in the comparison between in situ and satellite observations rather than to inaccuracies of the satellite retrieval. An inter-comparison with the standard Moderate Resolution Imaging Spectroradiometer (MODIS) Land Surface Temperature product exhibits RMSEs and biases in the ranges of 0.6 to 0.9 K and −0.5 to 0.2 K, respectively. The cross-platform consistency of the retrieval was found to be within ~0.3 K. For one lake, the satellite-derived trend was compared with the trend of in situ measurements, and both were found to be similar; thus, orbital drift does not cause artificial temperature trends in the data set. A comparison with LSWT derived through global sea surface temperature (SST) algorithms shows lower RMSEs and biases for the simulation-based approach. An ongoing project will apply the developed method to all of Europe to derive the climate signal of the last 30 years. The data are available at doi:10.1594/PANGAEA.831007.
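
For reference, the bias and RMSE figures quoted above follow the usual definitions over N matchups between satellite retrievals and in situ temperatures (standard definitions, assumed rather than quoted from the paper):

    % Bias and root-mean-square error over N satellite / in situ matchups
    % of retrieved temperature \hat{T}_i against in situ temperature T_i:
    \mathrm{bias} = \frac{1}{N} \sum_{i=1}^{N} ( \hat{T}_i - T_i ) ,
    \qquad
    \mathrm{RMSE} = \sqrt{ \frac{1}{N} \sum_{i=1}^{N} ( \hat{T}_i - T_i )^2 }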

Relevance: 30.00%

Abstract:

Purpose: To assess clinical outcomes and patterns of loco-regional failure (LRF) in relation to clinical target volumes (CTV) in patients with locally advanced hypopharyngeal and laryngeal squamous cell carcinoma (HL-SCC) treated with definitive intensity-modulated radiotherapy (IMRT) and concurrent systemic therapy. Methods: Data from HL-SCC patients treated from 2007 to 2010 were retrospectively evaluated. The primary endpoint was loco-regional control (LRC). Secondary endpoints included local control (LC), regional control (RC), distant metastasis free survival (DMFS), laryngectomy free survival (LFS), overall survival (OS), and acute and late toxicities. Time-to-event endpoints were estimated using the Kaplan-Meier method, and univariate and multivariate analyses were performed using Cox proportional hazards models. Recurrent gross tumor volume (RTV) on post-treatment diagnostic imaging was analyzed in relation to the corresponding CTV (in-volume, > 95% of RTV inside the CTV; marginal, 20–95% inside the CTV; out-volume, < 20% inside the CTV). Results: Fifty patients (stage III: 14, IVa: 33, IVb: 3) completed treatment and were included in the analysis (median follow-up 4.2 years). Three-year LRC, DMFS and OS were 77%, 96% and 63%, respectively. Grade 2 and 3 acute toxicity rates were 38% and 62%, respectively; grade 2 and 3 late toxicity rates were 23% and 15%, respectively. We identified 10 patients with LRF (8 local, 1 regional, 1 local + regional). Six out of 10 RTVs were fully included in both the elective and high-dose CTVs, and 4 RTVs were marginal to the high-dose CTVs. Conclusion: The treatment of locally advanced HL-SCC with definitive IMRT and concurrent systemic therapy provides good LRC rates with an acceptable toxicity profile. Nevertheless, the analysis of LRFs in relation to CTVs showed in-volume relapses to be the major mode of recurrence, indicating that novel strategies to overcome radioresistance are required.
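
The Kaplan-Meier product-limit estimator used for the time-to-event endpoints has the standard form (shown for orientation; d_i is the number of events and n_i the number of subjects at risk at event time t_i):

    % Kaplan-Meier product-limit estimate of the survival function S(t):
    \hat{S}(t) = \prod_{t_i \le t} \left( 1 - \frac{d_i}{n_i} \right)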

Relevance: 30.00%

Abstract:

BACKGROUND Bioresorbable scaffolds provide transient lumen support followed by complete resorption. OBJECTIVES This study examined whether very late scaffold thrombosis (VLScT) occurs when resorption is presumed to be nearly complete. METHODS Patients with VLScT at 3 tertiary care centers underwent thrombus aspiration followed by optical coherence tomography (OCT). Thrombus aspirates were analyzed by histopathological and spectroscopic examination. RESULTS Between March 2014 and February 2015, 4 patients presented with VLScT at 44 (case 1), 19 (cases 2 and 4), and 21 (case 3) months after implantation of an Absorb Bioresorbable Vascular Scaffold 1.1 (Abbott Laboratories, Abbott Park, Illinois). At the time of VLScT, all patients were taking low-dose aspirin, and 2 patients were also taking prasugrel. OCT showed malapposed scaffold struts surrounded by thrombus in 7.1%, 9.0%, and 8.9% of struts in cases 1, 2, and 4, respectively. Scaffold discontinuity with struts in the lumen center was the cause of malapposition in cases 2 and 4. Uncovered scaffold struts with superimposed thrombus were the predominant finding in case 3. OCT percent area stenosis at the time of VLScT was high in case 1 (74.8%) and case 2 (70.9%) without evidence of excessive neointimal hyperplasia. Spectroscopic analysis of the thrombus aspirates showed persistence of intracoronary polymer fragments in case 1. CONCLUSIONS VLScT may occur at advanced stages of scaffold resorption. Potential mechanisms specific to VLScT include scaffold discontinuity and restenosis during the resorption process, which appear delayed in humans; these findings suggest an extended period of vulnerability to thrombotic events.

Relevance: 30.00%

Abstract:

The binding of the immune inhibitory receptor Programmed Death 1 (PD-1) on T cells to its ligand PD-L1 has been implicated as a major contributor to tumor-induced immune suppression. Clinical trials of PD-L1 blockade have proven effective in unleashing therapeutic anti-tumor immune responses in a subset of patients with advanced melanoma, yet current response rates are low for reasons that remain unclear. Hypothesizing that the PD-1/PD-L1 pathway regulates T cell surveillance within the tumor microenvironment, we employed intravital microscopy to investigate the in vivo impact of PD-L1 blocking antibody on tumor-associated immune cell migration. However, current analytical methods for intravital dynamic microscopy data lack the ability to identify the cellular targets of T cell interactions in vivo, a crucial means of discovering which interactions are modulated by therapeutic intervention. By developing novel imaging techniques that allowed us to better analyze tumor progression and T cell dynamics in the microenvironment, we were able to explore the impact of PD-L1 blockade on the migratory properties of tumor-associated immune cells, including T cells and antigen-presenting cells, during lung tumor progression. Our results demonstrate that early changes in tumor morphology may be indicative of responsiveness to anti-PD-L1 therapy. We show that immune cells in the tumor microenvironment, as well as the tumors themselves, express PD-L1, but that immune phenotype alone is not a predictive marker of effective anti-tumor responses. Through a novel method for quantifying T cell interactions, we show that T cells are largely engaged in interactions with dendritic cells in the tumor microenvironment. Additionally, we show that during PD-L1 blockade, non-activated T cells are recruited in greater numbers into the tumor microenvironment and engage more preferentially with dendritic cells. We further show that during PD-L1 blockade, activated T cells engage in more confined, immune synapse-like interactions with dendritic cells, as opposed to the more dynamic, kinapse-like interactions observed when PD-L1 is free to bind its receptor. By advancing the contextual analysis of anti-tumor immune surveillance in vivo, this study implicates the interaction between T cells and tumor-associated dendritic cells as a possible point of modulation when targeting PD-L1 for anti-tumor immunotherapy.

Relevance: 30.00%

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing a program as a function of its input data sizes, without actually having to execute the program. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects that improve the efficiency, the accuracy, and the reliability of the analysis results still need to be investigated in depth. This thesis tackles this need from four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses that keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first considers array accesses in addition to object fields, while the second handles cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results, and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to re-analyze fragments of code that are not affected by the changes. During software development, programs are modified continually, yet most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods incrementally. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all the global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on developing a formal framework for verifying the resource guarantees obtained by the analyzers, instead of verifying the tools themselves. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code: COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that this tool cooperation can be used to automatically produce verified resource guarantees.
(4) Distribution and concurrency are today mainstream. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, keeps the cost of the diverse distributed components separate.
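
As a hypothetical illustration of the kind of guarantee involved (the method and the stated bound below are illustrative, not actual COSTA output), a cost analyzer infers a symbolic upper bound on a program's resource consumption as a function of its input sizes, which a verifier can then certify:

    // Hypothetical input for a cost analyzer such as COSTA: the analyzer
    // would infer a symbolic upper bound on the number of loop iterations
    // (here linear in a.length), which a verifier such as KeY could then
    // formally certify. The bound stated below is illustrative.
    public class Sum {
        // Plausible inferred cost bound: c(a) = a.length iterations.
        static int sum(int[] a) {
            int s = 0;
            for (int i = 0; i < a.length; i++) { // runs exactly a.length times
                s += a[i];
            }
            return s;
        }

        public static void main(String[] args) {
            System.out.println(sum(new int[] {1, 2, 3})); // prints 6
        }
    }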

Relevance: 30.00%

Abstract:

In this work, novel imaging designs with a single optical surface (either refractive or reflective) are presented. In some of these designs, both the object and image shapes are given, and the mapping from object to image is obtained as a result of the design. In other designs, not only the mapping but also the shape of the object is found in the design process. In the examples considered, the image is virtual and located at infinity, and it is seen from a known pupil, which can emulate a human eye. In the first, introductory part, 2D designs are carried out using three different design methods: an SMS design, a compound Cartesian oval surface, and a differential equation method for the limit case of a small pupil. In the point-size pupil limit, these three methods are proven to coincide. In the second part, the previous 2D designs are extended to 3D by rotation, and the astigmatism of the image is studied. As an advanced variation, the differential equation method is used to provide the freedom to control the tangential and sagittal rays simultaneously. As a result, designs without astigmatism (in the small-pupil limit) on a curved object surface have been obtained. Finally, this anastigmatic differential equation method is extended to 3D for the general case, in which freeform surfaces are designed.
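
For orientation, the compound Cartesian oval method builds on the defining property of a Cartesian oval surface (a standard formulation, assumed here rather than taken from the thesis): every ray from an object point S refracted at a surface point P toward an image point I has the same optical path length,

    % Cartesian oval: constant optical path length between conjugate
    % points S and I through every surface point P, with refractive
    % indices n_1 (object side) and n_2 (image side).
    n_1 \, \overline{SP} + n_2 \, \overline{PI} = \text{constant}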