931 results for Emission tuning
Abstract:
Objective: The PEM Flex Solo II (Naviscan, Inc., San Diego, CA) is currently the only commercially available positron emission mammography (PEM) scanner. This scanner does not apply corrections for count rate effects, attenuation or scatter during image reconstruction, potentially affecting the quantitative accuracy of images. This work measures the overall quantitative accuracy of the PEM Flex system and determines the contributions of error due to count rate effects, attenuation and scatter. Materials and Methods: Gelatin phantoms were designed to simulate breasts of different sizes (4–12 cm thick) with varying uniform background activity concentration (0.007–0.5 μCi/cc), cysts and lesions (2:1, 5:1, 10:1 lesion-to-background ratios). The overall error was calculated from ROI measurements in the phantoms with a clinically relevant background activity concentration (0.065 μCi/cc). The error due to count rate effects was determined by comparing the overall error at multiple background activity concentrations to the error at 0.007 μCi/cc. A point source and cold gelatin phantoms were used to assess the errors due to attenuation and scatter. The maximum pixel values in gelatin and in air were compared to determine the effect of attenuation. Scatter was evaluated by comparing the sum of all pixel values in gelatin and in air. Results: The overall error in the background was found to be negative in phantoms of all thicknesses, with the exception of the 4-cm-thick phantoms (0%±7%), and it grew in magnitude with thickness (-34%±6% for the 12-cm phantoms). All lesions exhibited large negative error (-22% for the 2:1 lesions in the 4-cm phantom), which became more negative with thickness and with lesion-to-background ratio (-85% for the 10:1 lesions in the 12-cm phantoms). The error due to count rate in phantoms with a 0.065 μCi/cc background was negative (-23%±6% for 4-cm thickness) and decreased in magnitude with thickness (-7%±7% for 12 cm).
Attenuation was a substantial source of negative error that grew with thickness (-51%±10% to -77%±4% in the 4- to 12-cm phantoms, respectively). Scatter contributed a relatively constant amount of positive error (+23%±11%) at all thicknesses. Conclusion: Applying corrections for count rate, attenuation and scatter will be essential for the PEM Flex Solo II to produce quantitatively accurate images.
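The error metric running through this abstract is a plain percent deviation from the known activity concentration. A minimal sketch follows; the function name and the ROI readings are illustrative assumptions, not data from the study.

```python
import numpy as np

def percent_error(measured, true):
    """Percent error of a measured activity concentration vs. ground truth."""
    return 100.0 * (measured - true) / true

# Illustrative ROI background readings (uCi/cc) against the true
# concentration of 0.065 uCi/cc; the values are made up for demonstration.
true_background = 0.065
roi_means = np.array([0.064, 0.055, 0.043])   # e.g. thin to thick phantoms
errors = percent_error(roi_means, true_background)
# Errors become more negative as attenuation grows with phantom thickness.
```

The same formula applies to lesion ROIs, with the lesion's known concentration as the ground truth.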
Abstract:
An Ensemble Kalman Filter is applied to assimilate observed tracer fields in various combinations in the Bern3D ocean model. Each tracer combination yields a set of optimal transport parameter values that are used in projections with prescribed CO2 stabilization pathways. Assimilating temperature and salinity fields alone yields an overly vigorous ventilation of the thermocline and the deep ocean, whereas including CFC-11 and radiocarbon improves the representation of physical and biogeochemical tracers and of ventilation time scales. Projected peak uptake rates and cumulative uptake of CO2 by the ocean are around 20% lower for the parameters determined with CFC-11 and radiocarbon as additional targets than for those determined with salinity and temperature only. Higher surface temperature changes are simulated in the Greenland–Norwegian–Iceland Sea and in the Southern Ocean when CFC-11 is included in the Ensemble Kalman model tuning. These findings highlight the importance of ocean transport calibration for the design of near-term and long-term CO2 emission mitigation strategies and for climate projections.
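The analysis step at the heart of such an assimilation can be sketched in a few lines. This is a generic stochastic Ensemble Kalman Filter update, not the Bern3D implementation; the observation operator, error level, and all array shapes are illustrative.

```python
import numpy as np

def enkf_update(ensemble, observations, H, obs_err_std, rng):
    """Stochastic Ensemble Kalman Filter analysis step.

    ensemble:     (n_state, n_members) forecast ensemble
    observations: (n_obs,) observed tracer values
    H:            (n_obs, n_state) linear observation operator
    """
    n_obs, n_members = H.shape[0], ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HX = H @ ensemble
    HA = HX - HX.mean(axis=1, keepdims=True)              # obs-space anomalies
    R = (obs_err_std ** 2) * np.eye(n_obs)                # obs error covariance
    P_HT = A @ HA.T / (n_members - 1)                     # cross covariance
    S = HA @ HA.T / (n_members - 1) + R                   # innovation covariance
    K = np.linalg.solve(S.T, P_HT.T).T                    # Kalman gain K = P_HT S^-1
    # Perturbed observations give each member its own update target.
    perturbed = observations[:, None] + obs_err_std * rng.standard_normal((n_obs, n_members))
    return ensemble + K @ (perturbed - HX)

rng = np.random.default_rng(0)
forecast = rng.standard_normal((3, 50))     # 3 state variables, 50 members
H = np.array([[1.0, 0.0, 0.0]])             # observe the first variable only
analysis = enkf_update(forecast, np.array([1.0]), H, obs_err_std=0.1, rng=rng)
```

The observed component of the state is pulled toward the observation, while unobserved components are adjusted only through the ensemble covariances — the mechanism by which tracer data constrain the transport parameters.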
Abstract:
PURPOSE Positron emission tomography (PET)/computed tomography (CT) measurements on small lesions are impaired by the partial volume effect, which is intrinsically tied to the point spread function of the actual imaging system, including the reconstruction algorithms. The variability resulting from different point spread functions hinders the assessment of quantitative measurements in clinical routine and especially degrades comparability within multicenter trials. To improve quantitative comparability, there is a need for methods to match different PET/CT systems through elimination of this systemic variability. Consequently, a new method was developed and tested that transforms the image of an object as produced by one tomograph to another image of the same object as it would have been seen by a different tomograph. The proposed new method, termed Transconvolution, compensates for differing imaging properties of different tomographs and particularly aims at quantitative comparability of PET/CT in the context of multicenter trials. METHODS To solve the problem of image normalization, the theory of Transconvolution was mathematically established together with new methods to handle point spread functions of different PET/CT systems. Knowing the point spread functions of two different imaging systems allows determining a Transconvolution function to convert one image into the other. This function is calculated by convolving one point spread function with the inverse of the other point spread function which, when adhering to certain boundary conditions such as the use of linear acquisition and image reconstruction methods, is a numerically accessible operation. For reliable measurement of such point spread functions characterizing different PET/CT systems, a dedicated solid-state phantom incorporating (68)Ge/(68)Ga filled spheres was developed.
To iteratively determine and represent such point spread functions, exponential density functions in combination with a Gaussian distribution were introduced. Furthermore, simulation of a virtual PET system provided a standard imaging system with clearly defined properties to which the real PET systems were to be matched. A Hann window served as the modulation transfer function for the virtual PET. The Hann window's apodization properties suppressed high spatial frequencies above a certain critical frequency, thereby fulfilling the above-mentioned boundary conditions. The determined point spread functions were subsequently used by the novel Transconvolution algorithm to match different PET/CT systems onto the virtual PET system. Finally, the theoretically elaborated Transconvolution method was validated by transforming phantom images acquired on two different PET systems into nearly identical data sets, as they would be imaged by the virtual PET system. RESULTS The proposed Transconvolution method matched different PET/CT systems for an improved and reproducible determination of a normalized activity concentration. The highest difference in measured activity concentration between the two different PET systems, 18.2%, was found in spheres of 2 ml volume. Transconvolution reduced this difference to 1.6%. In addition to reestablishing comparability, the new method with its parameterization of point spread functions allowed a full characterization of the imaging properties of the examined tomographs. CONCLUSIONS By matching different tomographs to a virtual standardized imaging system, Transconvolution opens a new comprehensive method for cross calibration in quantitative PET imaging. The use of a virtual PET system restores comparability between data sets from different PET systems by exerting a common, reproducible, and defined partial volume effect.
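The central idea — dividing the target system's modulation transfer function by that of the source system, with a Hann window enforcing the band limit — can be sketched in one dimension. The Gaussian PSF model, the FWHM, and the cut-off frequency below are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def gaussian_mtf(freqs, fwhm_mm):
    """MTF of a Gaussian PSF with the given FWHM (mm)."""
    sigma = fwhm_mm / 2.3548
    return np.exp(-2 * (np.pi * sigma * freqs) ** 2)

def hann_mtf(freqs, f_cut):
    """Hann-window MTF: suppresses all frequencies above f_cut."""
    mtf = 0.5 * (1 + np.cos(np.pi * freqs / f_cut))
    return np.where(np.abs(freqs) < f_cut, mtf, 0.0)

def transconvolve(image_1d, pixel_mm, fwhm_source, f_cut_virtual):
    """Map a 1-D image from a source system onto the virtual (Hann) system."""
    freqs = np.fft.fftfreq(image_1d.size, d=pixel_mm)
    H_src = gaussian_mtf(freqs, fwhm_source)
    H_virt = hann_mtf(freqs, f_cut_virtual)
    # Transconvolution function: target MTF divided by source MTF. The Hann
    # cut-off keeps the division well-conditioned, since the numerator is
    # zero wherever the denominator would become dangerously small.
    T = np.where(H_src > 1e-6, H_virt / np.maximum(H_src, 1e-6), 0.0)
    return np.fft.ifft(np.fft.fft(image_1d) * T).real

img = np.zeros(64)
img[32] = 1.0                     # point source
virtual_view = transconvolve(img, pixel_mm=2.0, fwhm_source=6.0, f_cut_virtual=0.15)
# Total activity (the DC component) is preserved, since T(0) = 1.
```

Running two systems' images through their respective transconvolution functions onto the same virtual system imposes one common, defined partial volume effect on both.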
Abstract:
Electronic tuning effects of substituents at the 4- and 8-positions of benzothiadiazole (BTD) within the fused tetrathiafulvalene–BTD donor–acceptor dyad have been studied. The electron acceptor strength of BTD is greatly increased by replacing Br with CN groups, extending the optical absorption of the small dyad into the near-IR region and importantly, the charge transport can be switched from p-type to ambipolar behaviour.
Abstract:
The destruction of tropical forests continues to accelerate at an alarming rate, contributing an important fraction of overall greenhouse gas emissions. In recent years, much hope has been vested in the emerging REDD+ framework under the UN Framework Convention on Climate Change (UNFCCC), which aims at creating an international incentive system to reduce emissions from deforestation and forest degradation. This paper argues that in the absence of an international consensus on the design of results-based payments, “bottom-up” initiatives should take the lead and explore new avenues. It suggests that a call for tender for REDD+ credits might both help leverage private investments and spend scarce public funds in a cost-efficient manner. The paper discusses the pros and cons of results-based approaches, provides an overview of the goals and principles that govern public procurement, and discusses their relevance for the purchase of REDD+ credits, in particular within the ambit of the European Union.
Abstract:
The paper analyzes how to comply with an emission constraint that restricts the use of an established energy technique, given two options: saving energy and investing in two alternative energy techniques. These techniques differ in the deterioration rates and investment lags of the corresponding capital stocks. Thus, the paper takes a medium-term perspective on climate change mitigation, where the time horizon is too short for technological change to occur but long enough for capital stocks to accumulate and deteriorate. It is shown that, in general, only one of the two alternative techniques prevails in the stationary state, although both techniques might be utilized during the transition phase. Hence, while in a static economy only one technique is efficient, this is not necessarily true in a dynamic economy.
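The interplay of deterioration rates and investment lags can be illustrated with a minimal discrete-time simulation. The accumulation rule, parameter values, and constant investment stream below are assumptions for illustration, not the paper's model.

```python
# Minimal sketch: two alternative energy techniques whose capital stocks
# accumulate with an investment lag and decay at a deterioration rate.

def simulate(deterioration, lag, investment, horizon):
    """Evolve one capital stock: K[t+1] = (1 - delta) * K[t] + I[t - lag]."""
    K = [0.0]
    for t in range(horizon):
        delayed = investment[t - lag] if t >= lag else 0.0
        K.append((1 - deterioration) * K[-1] + delayed)
    return K

horizon = 40
invest = [1.0] * horizon          # constant investment stream
# Technique A: slow deterioration, long lag; B: fast deterioration, short lag.
K_A = simulate(deterioration=0.02, lag=5, investment=invest, horizon=horizon)
K_B = simulate(deterioration=0.10, lag=1, investment=invest, horizon=horizon)
# B builds capacity sooner; A dominates in the long run.
```

This is the kind of trade-off that lets one technique serve during the transition phase while the other prevails in the stationary state.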
Abstract:
We study the tuning curve of entangled photons generated by type-0 spontaneous parametric down-conversion in a periodically poled potassium titanyl phosphate crystal. We demonstrate the X-shaped spatiotemporal structure of the spectrum by means of measurements and numerical simulations. Experiments for different pump waists, crystal temperatures, and crystal lengths are in good agreement with numerical simulations.
Abstract:
The artificial pancreas is at the forefront of research toward automatic insulin infusion for patients with type 1 diabetes. Due to the high inter- and intra-patient variability of the diabetic population, the need for personalized approaches has been raised. This study presents an adaptive, patient-specific control strategy for glucose regulation based on reinforcement learning, and more specifically on the Actor-Critic (AC) learning approach. The control algorithm provides daily updates of the basal rate and insulin-to-carbohydrate (IC) ratio in order to optimize glucose regulation. A method for the automatic and personalized initialization of the control algorithm is designed based on the estimation of the transfer entropy (TE) between insulin and glucose signals. The algorithm has been evaluated in silico in adults, adolescents and children for 10 days. Three scenarios of initialization, to i) zero values, ii) random values and iii) TE-based values, have been comparatively assessed. The results have shown that when the TE-based initialization is used, the algorithm achieves faster learning, with 98%, 90% and 73% in the A+B zones of the Control Variability Grid Analysis for adults, adolescents and children, respectively, after five days, compared to 95%, 78% and 41% for random initialization and 93%, 88% and 41% for zero initial values. Furthermore, in the case of children, the daily Low Blood Glucose Index decreases much faster when the TE-based tuning is applied. The results imply that automatic and personalized tuning based on TE reduces the learning period and improves the overall performance of the AC algorithm.
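A minimal actor-critic loop for a single daily-updated control parameter might look as follows. The linear "patient" model, reward scaling, and learning rates are illustrative assumptions, not the controller or simulator used in the study.

```python
import numpy as np

def patient_response(dose):
    """Crude stand-in model: mean daily glucose falls linearly with dose."""
    return 180.0 - 20.0 * dose      # mg/dL

rng = np.random.default_rng(1)
target = 110.0
dose_mean = 1.0                     # actor parameter (the initialization point)
sigma = 0.3                         # policy exploration noise
value = -0.5                        # critic's baseline estimate of the reward
alpha_actor, alpha_critic = 0.02, 0.1

for day in range(2000):
    noise = sigma * rng.standard_normal()
    dose = dose_mean + noise                        # sample action from policy
    glucose = patient_response(dose)
    reward = -abs(glucose - target) / 100.0         # scaled tracking penalty
    td_error = reward - value                       # advantage vs. baseline
    value += alpha_critic * td_error                # critic update
    dose_mean += alpha_actor * td_error * noise / sigma**2  # actor update
```

A better starting value for dose_mean, which is what the TE-based initialization provides, shortens the early learning phase — the effect the study quantifies via the Control Variability Grid Analysis.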
Abstract:
The generation of collimated electron beams from metal double-gate nanotip arrays excited by near-infrared laser pulses is studied. Using electromagnetic and particle tracking simulations, we show that electron pulses with small rms transverse velocities are efficiently produced from nanotip arrays by laser-induced field emission when the laser wavelength is tuned to the surface plasmon polariton resonance of the stacked double-gate structure. The result indicates the possibility of realizing a metal nanotip array cathode that outperforms state-of-the-art photocathodes.
Abstract:
This minireview highlights three aspects of our recent work in the area of sugar-modified oligonucleotide analogues. It provides an overview of recent results on the conformationally constrained analogue tricyclo-DNA with special emphasis on its antisense properties, it summarizes results on triple-helix-forming oligodeoxynucleotides containing pyrrolidino-nucleosides with respect to DNA recognition via the dual recognition mode, and it highlights the advantageous application of the orthogonal oligonucleotide pairing system homo-DNA in molecular beacons for DNA diagnostics.
Abstract:
In the genus Petunia, distinct pollination syndromes may have evolved in association with bee visitation (P. integrifolia spp.) or hawk moth visitation (P. axillaris spp.). We investigated the extent of congruence between floral fragrance and olfactory perception of the hawk moth Manduca sexta. The hawk moth-pollinated P. axillaris releases high levels of several compounds, whereas the bee-pollinated P. integrifolia releases benzaldehyde almost exclusively. The three dominant compounds in P. axillaris were benzaldehyde, benzyl alcohol and methyl benzoate. In P. axillaris, benzenoid emission showed a circadian rhythm with a peak at night, which was absent from P. integrifolia. These characters were highly conserved among different P. axillaris subspecies and accessions, with some differences in fragrance composition. Electroantennogram (EAG) recordings of female M. sexta antennae using flower blends of different wild Petunia species showed that P. axillaris odours elicited stronger responses than P. integrifolia odours. EAG responses were highest to the three dominant compounds in the P. axillaris flower odours. Further, EAG responses to odour samples collected from P. axillaris flowers confirmed that odours collected at night evoked stronger responses from M. sexta than odours collected during the day. These results show that the timing of odour emission by P. axillaris is in tune with nocturnal hawk moth activity and that flower-volatile composition is adapted to the antennal perception of these pollinators.
Abstract:
The functions of ribosomes in translation are complex and involve different types of activities critical for decoding the genetic code, linking amino acids via amide bonds to form polypeptide chains, and releasing and properly targeting the synthesized protein. Non-protein-coding RNAs (ncRNAs) have been recognized to be crucial in establishing regulatory networks [1]. However, all of the recently discovered ncRNAs involved in translation regulation target the mRNA rather than the ribosome. The main goal of this project is to identify potential novel ncRNAs that directly bind and possibly regulate the ribosome during protein biosynthesis. To address this question, we applied various stress conditions to the archaeal model organism Haloferax volcanii and deep-sequenced the ribosome-associated small ncRNA interactome. In total we identified 6,250 ncRNA candidates. Significantly, we observed a pronounced presence of tRNA-derived fragments (tRFs). These tRFs have been identified in all domains of life and represent a growing, yet functionally poorly understood, class of ncRNAs. Here we present evidence that tRFs from H. volcanii directly bind to ribosomes. In the presented genomic screen of the ribosome-associated RNome, a 26-residue-long fragment originating from the 5' part of valine tRNA was by far the most abundant tRF. This Val-tRF is processed in a stress-dependent manner and was found to primarily target the small ribosomal subunit in vitro and in vivo. As a consequence of ribosome binding, Val-tRF reduces protein synthesis by interfering with peptidyl transferase activity. Therefore this tRF functions as a ribosome-bound small ncRNA capable of regulating gene expression in H. volcanii under environmental stress conditions, probably by fine-tuning the rate of protein production [2]. Currently we are investigating the binding site of this tRF on the 30S subunit in more detail.
Abstract:
The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached based on detailed measurements of hadron emission from the same target as used by T2K, exposed to a proton beam of the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007-2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper, details of the experiment, data taking, data analysis method and results from the 2007 pilot run are presented. Furthermore, the application of the NA61/SHINE measurements to the predictions of the T2K initial neutrino flux is described and discussed.