928 results for Sinogram-affirmed iterative reconstruction
Abstract:
This paper presents our investigation of the iterative decoding performance of some sparse-graph codes on block-fading Rayleigh channels. The considered code ensembles are standard LDPC codes and Root-LDPC codes, first proposed in and shown to be able to attain the full transmission diversity. We study the iterative threshold performance of these codes as a function of the fading gains of the transmission channel and propose a numerical approximation of the iterative threshold versus fading gains, for both LDPC and Root-LDPC codes. Also, we show analytically that, in the case of 2 fading blocks, the iterative threshold of Root-LDPC codes is proportional to (α1 α2)⁻¹, where α1 and α2 are the corresponding fading gains. From this result, the full diversity property of Root-LDPC codes immediately follows.
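The final step of the argument can be sketched as follows. This derivation is my own illustration of why an inverse-product threshold implies full diversity, not taken from the abstract; the constant c, the SNR parameterization γ, and the exponential model for the gains are assumptions:

```latex
% If the iterative decoding threshold scales inversely with the product of
% the fading gains,
\[
  \gamma^{\ast} \;\propto\; \frac{1}{\alpha_1 \alpha_2},
\]
% then decoding fails only when the channel SNR $\gamma$ falls below the
% threshold, i.e. when $\alpha_1 \alpha_2 < c/\gamma$ for some constant $c$:
\[
  P_{\mathrm{out}}(\gamma) \;=\; \Pr\!\left[\alpha_1 \alpha_2 < \frac{c}{\gamma}\right].
\]
% Treating the gains as independent and exponentially distributed (squared
% Rayleigh amplitudes), the CDF of the product near zero behaves as
% $\Pr[\alpha_1 \alpha_2 < t] \approx t \ln(1/t)$ as $t \to 0$, so
\[
  P_{\mathrm{out}}(\gamma) \;\approx\; \frac{c}{\gamma}\,
  \ln\!\frac{\gamma}{c} \cdot \frac{1}{\gamma^{0}}
  \;=\; O\!\left(\frac{\ln \gamma}{\gamma^{2}}\right)\cdot \gamma
  \quad\text{is dominated by } \gamma^{-2} \text{ up to a log factor},
\]
% which is the full diversity order (2) achievable with 2 fading blocks.
```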
Resumo:
We evaluated 25 protocol variants of 14 independent computational methods for exon identification, transcript reconstruction and expression-level quantification from RNA-seq data. Our results show that most algorithms are able to identify discrete transcript components with high success rates but that assembly of complete isoform structures poses a major challenge even when all constituent elements are identified. Expression-level estimates also varied widely across methods, even when based on similar transcript models. Consequently, the complexity of higher eukaryotic genomes imposes severe limitations on transcript recall and splice product discrimination that are likely to remain limiting factors for the analysis of current-generation RNA-seq data.
Resumo:
For a wide range of environmental, hydrological, and engineering applications there is a fast growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While in exploration seismology waveform tomographic imaging has become well established over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes on synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments.
To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when directly incorporating the wavelet estimation into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes which clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters.
This is crucial since in reality, these parameters are known to be frequency-dependent and complex and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure that is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
Resumo:
The defaults of Philip II have attained mythical status as the origin of sovereign debt crises. Four times during his reign the king failed to honor his debts and had to renegotiate borrowing contracts. In this paper, we reassess the fiscal position of Habsburg Spain. New archival evidence allows us to derive comprehensive estimates of debt and revenue. These show that primary surpluses were sufficient to make the king's debt sustainable in most scenarios. Spain's debt burden was manageable up to the 1580s, and its fiscal position only deteriorated for good after the defeat of the "Invincible Armada." We also estimate fiscal policy reaction functions, and show that Spain under the Habsburgs was at least as "responsible" as the US in the 20th century or as Britain in the 18th century. Our results suggest that the outcomes of uncertain events such as wars may have influenced the history of default more than strict adherence to fiscal rules.
Resumo:
The aim of this study was to compare the diagnostic efficiency of plain film and spiral CT examinations with 3D reconstructions of 42 tibial plateau fractures and to assess the accuracy of these two techniques in the pre-operative surgical plan in 22 cases. Forty-two tibial plateau fractures were examined with plain film (anteroposterior, lateral, two obliques) and spiral CT with surface-shaded-display 3D reconstructions. The Swiss AO-ASIF classification system of bone fracture from Muller was used. In 22 cases the surgical plans and the sequence of reconstruction of the fragments were prospectively determined with both techniques, successively, and then correlated with the surgical reports and post-operative plain film. The fractures were underestimated with plain film in 18 of 42 cases (43%). Thanks to the precise pre-operative information provided by the spiral CT 3D reconstructions, the surgical plans based on plain film were modified and adjusted in 13 of 22 cases (59%). Spiral CT 3D reconstructions give a better and more accurate demonstration of the tibial plateau fracture and allow a more precise pre-operative surgical plan.
Resumo:
One of the standard tools used to understand the processes shaping trait evolution along the branches of a phylogenetic tree is the reconstruction of ancestral states (Pagel 1999). The purpose is to estimate the values of the trait of interest for every internal node of a phylogenetic tree based on the trait values of the extant species, a topology and, depending on the method used, branch lengths and a model of trait evolution (Ronquist 2004). This approach has been used in a variety of contexts such as biogeography (e.g., Nepokroeff et al. 2003, Blackburn 2008), ecological niche evolution (e.g., Smith and Beaulieu 2009, Evans et al. 2009) and metabolic pathway evolution (e.g., Gabaldón 2003, Christin et al. 2008). Investigations of the factors affecting the accuracy with which ancestral character states can be reconstructed have focused in particular on the choice of statistical framework (Ekman et al. 2008) and the selection of the best model of evolution (Cunningham et al. 1998, Mooers et al. 1999). However, other potential biases affecting these methods, such as the effect of tree shape (Mooers 2004), taxon sampling (Salisbury and Kim 2001) as well as reconstructing traits involved in species diversification (Goldberg and Igić 2008), have also received specific attention. Most of these studies conclude that ancestral character state reconstruction is still not perfect, and that further developments are necessary to improve its accuracy (e.g., Christin et al. 2010). Here, we examine how different estimations of branch lengths affect the accuracy of ancestral character state reconstruction. In particular, we tested the effect of using time-calibrated versus molecular branch lengths and provide guidelines for selecting the most appropriate branch lengths to reconstruct the ancestral state of a trait.
Resumo:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is, consequently, imperative to precisely estimate the value of LVEF--a process that can be done with myocardial perfusion scintigraphy. Therefore, the present study aimed to establish and compare the estimation performance of the quantitative parameters of the reconstruction methods filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). METHODS: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with the cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with the orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. RESULTS: With FBP, the values of end-diastolic, end-systolic, and the stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. This same pattern is verified with the OSEM reconstruction. However, with OSEM there is a more precise estimation of the quantitative parameters, especially with the combinations 2 iterations × 10 subsets and 2 iterations × 12 subsets. CONCLUSION: The OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM and a cutoff frequency of 0.5 cycles per pixel with the orders 5, 10, or 15 for FBP as the best estimations for the left ventricular volumes and ejection fraction quantification in myocardial perfusion scintigraphy.
Resumo:
OBJECT: To study a scan protocol for coronary magnetic resonance angiography based on multiple breath-holds featuring 1D motion compensation and to compare the resulting image quality to a navigator-gated free-breathing acquisition. Image reconstruction was performed using L1 regularized iterative SENSE. MATERIALS AND METHODS: The effects of respiratory motion on the Cartesian sampling scheme were minimized by performing data acquisition in multiple breath-holds. During the scan, repetitive readouts through a k-space center were used to detect and correct the respiratory displacement of the heart by exploiting the self-navigation principle in image reconstruction. In vivo experiments were performed in nine healthy volunteers and the resulting image quality was compared to a navigator-gated reference in terms of vessel length and sharpness. RESULTS: Acquisition in breath-hold is an effective method to reduce the scan time by more than 30 % compared to the navigator-gated reference. Although an equivalent mean image quality with respect to the reference was achieved with the proposed method, the 1D motion compensation did not work equally well in all cases. CONCLUSION: In general, the image quality scaled with the robustness of the motion compensation. Nevertheless, the featured setup provides a positive basis for future extension with more advanced motion compensation methods.
Resumo:
The defaults of Philip II have attained mythical status as the origin of sovereign debt crises. We reassess the fiscal position of Habsburg Castile, deriving comprehensive estimates of revenue, debt, and expenditure from new archival data. The king's debts were sustainable. Primary surpluses were large and rising. Debt-to-revenue ratios remained broadly unchanged during Philip's reign. Castilian finances in the sixteenth century compare favorably with those of other early modern fiscal states at the height of their imperial ambitions, including Britain. The defaults of Philip II therefore reflected short-term liquidity crises, and were not a sign of unsustainable debts.
Resumo:
Since 1998 the highly polluted Havana Bay ecosystem has been the subject of a mitigation program. In order to determine whether pollution-reduction strategies were effective, we have evaluated the historical trends of pollution recorded in sediments of the Bay. A sediment core was dated radiometrically using natural and artificial fallout radionuclides. An irregularity in the (210)Pb record was caused by an episode of accelerated sedimentation. This episode was dated to 1982, a year coincident with the heaviest rains reported in Havana over the 20th century. Peaks of mass accumulation rates (MAR) were associated with hurricanes and intensive rains. In the past 60 years, these maxima are related to strong El Niño periods, which are known to increase rainfall in the north Caribbean region. We observed a steady increase of pollution (mainly Pb, Zn, Sn, and Hg) from the beginning of the century to the mid-90s, with enrichment factors as high as 6. MAR and pollution decreased rapidly after the mid-90s, although some trace metal levels remain high. This reduction was due to the integrated coastal zone management program introduced in the late 90s, which reduced catchment erosion and pollution.
Resumo:
Objective: Reconstruction of alar structures of the nose remains difficult. The result has to be not only functional but also aesthetic. Different solutions to reconstruct alar defects are feasible. A good result that meets the specific demands on stability, aesthetics, and stable architecture without shrinkage of the area is not easily achieved. Method: A perichondrial cutaneous graft (PCCG), a graft consisting of a perichondrial layer, fatty tissue, and skin that is harvested retroauricularly, is combined with an attached cartilage strip. Case Result: A 72-year-old patient suffering from basal cell carcinoma of the ala of the nose underwent the reconstructive procedure with a good result at 1 year in terms of stability, color match, and graft take. Conclusion: For the first time, a strip of cartilage was included in a PCCG in a case where tumor resection required sacrifice of more than 50% of the alar rim. The case shows that one can consider a cartilage strip-enhanced PCCG graft to reconstruct alar defects.
Resumo:
INTRODUCTION: Reconstructions of the fronto-orbital area remain a challenge to the reconstructive surgeon, due to the functional and esthetic impact. OBSERVATION: The authors present a case of a complex fronto-orbital reconstruction with a PEEK (PolyEtherEtherKetone) implant, associated with skin expansion. DISCUSSION: With a follow-up of over three years, the cosmetic result is excellent. The authors believe that this technique is reliable and fast, with good long-term results.
Resumo:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform finally yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero-padding and 2D-DFT oversampling rates together with radial cubic b-spline interpolation improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
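The pipeline described above (1D FFT of each projection, placement along radial lines of 2D Fourier space per the central-slice theorem, polar-to-Cartesian resampling, inverse 2D FFT) can be sketched in NumPy/SciPy. This is my own minimal illustration, not the ITK implementation the abstract refers to; it uses plain linear interpolation rather than the cubic b-spline interpolation discussed there, and the toy `radon` forward projector is an assumption added for a self-contained demo:

```python
import numpy as np
from scipy.ndimage import rotate
from scipy.interpolate import griddata

def radon(image, angles_deg):
    """Toy parallel-beam forward projection: rotate, then sum along columns."""
    return np.stack([rotate(image, -a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def direct_fourier_reconstruct(sinogram, angles_deg, size):
    """Direct Fourier reconstruction via the central-slice theorem."""
    n_ang, n_det = sinogram.shape
    # 1D FFT of each projection gives one radial line of the 2D spectrum
    proj_fft = np.fft.fftshift(
        np.fft.fft(np.fft.ifftshift(sinogram, axes=1), axis=1), axes=1)
    freqs = np.fft.fftshift(np.fft.fftfreq(n_det))
    # Polar coordinates of every spectral sample
    theta = np.deg2rad(angles_deg)[:, None]
    kx = (freqs[None, :] * np.cos(theta)).ravel()
    ky = (freqs[None, :] * np.sin(theta)).ravel()
    # Resample the polar samples onto a Cartesian frequency grid
    # (real and imaginary parts interpolated separately for robustness)
    grid = np.fft.fftshift(np.fft.fftfreq(size))
    KX, KY = np.meshgrid(grid, grid)
    cart = (griddata((kx, ky), proj_fft.real.ravel(), (KX, KY),
                     method='linear', fill_value=0.0)
            + 1j * griddata((kx, ky), proj_fft.imag.ravel(), (KX, KY),
                            method='linear', fill_value=0.0))
    # Inverse 2D FFT finally yields the reconstructed image
    img = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(cart)))
    return np.real(img)
```

As the abstract notes, naive interpolation in the frequency domain is the weak point of this scheme; zero-padding the projections and oversampling the 2D-DFT grid before interpolating are the remedies it evaluates.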