332 results for incoherent correlator
Abstract:
We consider numerical data for the lattice Landau gluon propagator obtained at very large lattice volumes in three-dimensional pure SU(2) Yang-Mills gauge theory (YM32). We find that the temporal correlator C(t) shows an oscillatory pattern and is negative for several values of t. This is an explicit violation of reflection positivity and can be related to gluon confinement. We also obtain a good fit for this quantity in the whole time interval using a sum of Stingl-like propagators.
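A Stingl-like propagator has complex-conjugate poles, so each term contributes a damped oscillation to the temporal correlator. The sketch below (with invented parameter values, not the fitted values from the paper) illustrates how a sum of such terms naturally yields the negative C(t) values that signal the violation of reflection positivity:

```python
import numpy as np

# One "Stingl-like" term: a damped oscillation in Euclidean time t.
# A propagator with complex-conjugate poles yields a temporal correlator
# of this form (parameters below are illustrative only).
def stingl_term(t, c, a, b, phi):
    return c * np.exp(-a * t) * np.cos(b * t + phi)

def correlator(t, params):
    # Sum of Stingl-like terms.
    return sum(stingl_term(t, *p) for p in params)

t = np.linspace(0.0, 10.0, 201)
C = correlator(t, [(1.0, 0.5, 1.2, 0.3)])
# The cosine factor makes C(t) oscillate and turn negative for some t,
# which is exactly the reflection-positivity violation described above.
print(C.min() < 0.0)
```

Fitting this form to lattice data would amount to adjusting (c_i, a_i, b_i, phi_i) for each term by least squares.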
Abstract:
We derive simple and physically transparent expressions for the contribution of the strong interaction to one-nucleon-removal processes in peripheral relativistic heavy-ion collisions. The coherent contribution, i.e., the excitation of a giant dipole resonance via meson exchange, is shown to be negligible, as is the interference between Coulomb and nuclear excitation. The incoherent nucleon-knockout contribution is also derived, suggesting the nature of the nuclear interaction in this class of processes. We also justify the simple formulae used to fit the data of the E814 Collaboration. © 1995 Elsevier Science B.V. All rights reserved.
Abstract:
The mechanism of forward-angle incoherent photoproduction of pseudoscalar mesons off nuclei is revisited via the time-dependent multicollisional Monte Carlo (MCMC) intranuclear cascade model. Our results, combined with recent developments to address coherent photoproduction, reproduce with good accuracy recent JLab data of pi0 photoproduction from carbon and lead at an average photon energy k ~ 5.2 GeV. For the case of eta photoproduction, our results for k = 9 GeV suggest that future measurements to extract the eta -> gamma gamma decay width via the Primakoff method should be focused on light nuclei, where the disentanglement between the Coulomb and strong amplitudes is more easily achieved. The prospects to use heavy-nuclei data to access the unknown eta-N cross section in cold nuclear matter are also presented.
Abstract:
Development of methods for rapid screening and stratification of subjects after exposure is an integral part of countermeasures against radiation. The potential demographic and exposure history-related heterogeneity of exposed populations warrants robust biomarkers that withstand and reflect such differences. In this study, the effect of aging and repeated exposure on the metabolic response to sublethal irradiation was examined in mice using UPLC-ESI-QTOF mass spectrometry. Aging attenuated postexposure elevation in excretions of DNA damage biomarkers as well as N(1)-acetylspermidine. Although N(1)-acetylspermidine and 2'-deoxyuridine elevation was highly correlated in all age groups, xanthine and N(1)-acetylspermidine elevation was poorly correlated in older mice. These results may reflect the established decline in DNA damage-repair efficiency associated with aging and indicate a novel role for polyamine metabolism in the process. Although repeated irradiation at long intervals did not affect the elevation of N(1)-acetylspermidine, 2'-deoxyuridine, and xanthine, it did significantly attenuate the elevation of 2'-deoxycytidine and thymidine compared to a single exposure. However, these biomarkers were found to identify exposed subjects with accuracy ranging from 82% (xanthosine) to 98% (2'-deoxyuridine), irrespective of their age and exposure history. This indicates that metabolic biomarkers can act as robust noninvasive signatures of sublethal radiation exposure.
Abstract:
PURPOSE To systematically evaluate the dependence of intravoxel incoherent motion (IVIM) parameters on the b-value threshold separating the perfusion and diffusion compartments, and to implement and test an algorithm for the standardized computation of this threshold. METHODS Diffusion-weighted images of the upper abdomen were acquired at 3 Tesla in eleven healthy male volunteers with 10 different b-values and in two healthy male volunteers with 16 different b-values. Region-of-interest IVIM analysis was applied to the abdominal organs and skeletal muscle with a systematic increase of the b-value threshold for computing the pseudodiffusion D*, perfusion fraction Fp, diffusion coefficient D, and the sum of squared residuals of the bi-exponential IVIM fit. RESULTS IVIM parameters strongly depended on the choice of the b-value threshold. The proposed algorithm successfully provided optimal b-value thresholds with the smallest residuals for all evaluated organs [s/mm^2]: e.g., right liver lobe 20, spleen 20, right renal cortex 150, skeletal muscle 150. Mean D* [10^-3 mm^2/s], Fp [%], and D [10^-3 mm^2/s] values (± standard deviation) were: right liver lobe 88.7 ± 42.5, 22.6 ± 7.4, 0.73 ± 0.12; right renal cortex 11.5 ± 1.8, 18.3 ± 2.9, 1.68 ± 0.05; spleen 41.9 ± 57.9, 8.2 ± 3.4, 0.69 ± 0.07; skeletal muscle 21.7 ± 19.0, 7.4 ± 3.0, 1.36 ± 0.04. CONCLUSION IVIM parameters strongly depend upon the choice of the b-value threshold used for computation. The proposed algorithm may be used as a robust approach for IVIM analysis without organ-specific adaptation. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
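The threshold-selection idea lends itself to a compact sketch. The minimal implementation below (a segmented IVIM fit on noise-free synthetic data; the parameter values and fit details are illustrative assumptions, not the authors' code) sweeps candidate thresholds and keeps the one with the smallest residuals:

```python
import numpy as np

# Bi-exponential IVIM signal model (S normalized to S0):
#   S(b) = Fp * exp(-b * Dstar) + (1 - Fp) * exp(-b * D)
def ivim(b, Fp, Dstar, D):
    return Fp * np.exp(-b * Dstar) + (1.0 - Fp) * np.exp(-b * D)

def segmented_fit(b, S, threshold):
    """Segmented IVIM fit for one candidate b-value threshold:
    1) D and Fp from a log-linear fit of the tail (b >= threshold),
    2) D* from the perfusion residue at low b,
    3) return the sum of squared residuals of the resulting fit."""
    tail = b >= threshold
    slope, intercept = np.polyfit(b[tail], np.log(S[tail]), 1)
    D = -slope
    Fp = 1.0 - np.exp(intercept)
    low = (b < threshold) & (b > 0)
    perf = S[low] - (1.0 - Fp) * np.exp(-b[low] * D)  # perfusion compartment
    Dstar = -np.polyfit(b[low], np.log(np.clip(perf, 1e-12, None)), 1)[0]
    resid = np.sum((S - ivim(b, Fp, Dstar, D)) ** 2)
    return Fp, Dstar, D, resid

# Synthetic, noise-free example (values purely illustrative)
b = np.array([0, 10, 20, 40, 80, 150, 300, 500, 750, 1000], float)
S = ivim(b, Fp=0.2, Dstar=0.05, D=0.001)

# Sweep candidate thresholds and keep the one with the smallest residuals,
# mirroring the standardized threshold selection described above.
fits = {thr: segmented_fit(b, S, thr) for thr in (40, 80, 150, 300)}
best = min(fits, key=lambda thr: fits[thr][3])
print(best, fits[best][:3])
```

On real noisy data one would typically replace the log-linear steps with a constrained nonlinear least-squares fit, but the threshold sweep itself is unchanged.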
Abstract:
Quasielastic incoherent neutron scattering from hydrogen atoms, which are distributed nearly homogeneously in biological molecules, allows the investigation of diffusive motions occurring on the pico- to nanosecond time scale. A quasielastic incoherent neutron scattering study was performed on the integral membrane protein bacteriorhodopsin (BR), which is a light-driven proton pump in Halobacterium salinarium. BR is embedded in lipids, forming patches in the cell membrane of the organism, the so-called purple membranes (PMs). Measurements were carried out at room temperature on oriented PM-stacks hydrated at two different levels (low hydration, h = 0.03 g of D2O per g of PM; high hydration, h = 0.28 g of D2O per g of PM) using time-of-flight spectrometers. From the measured spectra, different diffusive components were identified and analyzed with respect to the influence of hydration. This study supports the idea that a decrease in hydration results in an appreciable decrease in internal molecular flexibility of the protein structure. Because it is known from studies on the function of BR that the pump activity is reduced if the hydration level of the protein is insufficient, we conclude that the observed diffusive motions are essential for the function of this protein. A detailed analysis and classification of the different kinds of diffusive motions, predominantly occurring in PMs under physiological conditions, is presented.
Abstract:
Many attempts have been made to overcome problems involved in character recognition, which have resulted in the manufacture of character reading machines. An investigation into a new approach to character recognition is described. Features for recognition are Fourier coefficients. These are generated optically by convolving characters with periodic gratings. The development of hardware to enable automatic measurement of contrast and position of periodic shadows produced by the convolution is described. Fourier coefficients of character sets were measured, many of which are tabulated. Their analysis revealed that a few low-frequency sampling points could be selected to recognise sets of numerals. Limited treatment is given to show the effect of typeface variations on the values of coefficients, which culminated in the location of six sampling frequencies used as features to recognise numerals in two type fonts. Finally, the construction of two character recognition machines is compared and contrasted. The first is a pilot plant based on a test-bed optical Fourier analyser, while the second is a more streamlined machine designed for high-speed reading. Reasons to indicate that the latter machine would be the most suitable to adapt for industrial and commercial applications are discussed.
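Although the original system generated the coefficients optically, the feature-selection idea translates directly into a digital sketch: compute a character's 2-D Fourier transform and classify on a few low-frequency coefficients. The toy glyphs below are invented 5×5 stand-ins, not the tabulated character sets from the work above:

```python
import numpy as np

# Digital analogue of the optical scheme: take a small binary character
# image, compute its 2-D Fourier transform, and keep only a few
# low-frequency coefficient magnitudes as recognition features.
GLYPHS = {
    "0": ["01110", "10001", "10001", "10001", "01110"],
    "1": ["00100", "01100", "00100", "00100", "01110"],
    "7": ["11111", "00001", "00010", "00100", "00100"],
}

def features(glyph, n=3):
    img = np.array([[int(c) for c in row] for row in glyph], float)
    spec = np.abs(np.fft.fft2(img))
    return spec[:n, :n].ravel()  # low-frequency sampling points only

def recognise(glyph, templates):
    f = features(glyph)
    return min(templates, key=lambda k: np.linalg.norm(f - templates[k]))

templates = {k: features(g) for k, g in GLYPHS.items()}
print(recognise(GLYPHS["7"], templates))  # → "7"
```

As in the study, only a handful of low-frequency sampling points are needed to separate a small numeral set.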
Abstract:
Random number generation is a central component of modern information technology, with crucial applications in ensuring communications and information security. The development of new physical mechanisms suitable to directly generate random bit sequences is thus a subject of intense current research, with particular interest in all-optical techniques suitable for the generation of data sequences with high bit rate. One such promising technique that has received much recent attention is the chaotic semiconductor laser system, which produces high-quality random output as a result of the intrinsic nonlinear dynamics of its architecture [1]. Here we propose a novel complementary concept of an all-optical technique that might dramatically increase the generation rate of random bits by simultaneously using multiple spectral channels with uncorrelated signals, somewhat similar to the use of wavelength-division multiplexing in communications. We propose to exploit the intrinsic nonlinear dynamics of extreme spectral broadening and supercontinuum (SC) generation in optical fibre, a process known to be often associated with non-deterministic fluctuations [2]. In this paper, we report proof-of-concept results indicating that the fluctuations in highly nonlinear fibre SC generation can potentially be used for random number generation.
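Whatever the physical source, turning fluctuating pulse intensities into random bits follows a common recipe: threshold each sample, then debias. A minimal sketch, with simulated Gaussian fluctuations standing in for measured SC intensities (the physics is not modelled here):

```python
import random

# Stand-in for the physics: a sequence of pulse-to-pulse intensities with
# non-deterministic fluctuations (simulated; in the scheme above these
# would come from measured spectral channels of the supercontinuum).
random.seed(1)
intensities = [10.0 + random.gauss(0.0, 1.0) for _ in range(4000)]

# 1) Comparator: one raw bit per pulse, thresholding at the median.
med = sorted(intensities)[len(intensities) // 2]
raw = [1 if v > med else 0 for v in intensities]

# 2) Von Neumann debiasing to remove residual bias: map pairs
#    01 -> 0, 10 -> 1, and discard 00 and 11.
bits = [x for x, y in zip(raw[::2], raw[1::2]) if x != y]

ones = sum(bits) / len(bits)
print(len(bits), round(ones, 3))  # roughly balanced output
```

Running several uncorrelated spectral channels through this pipeline in parallel is what would multiply the aggregate bit rate, as proposed above.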
Abstract:
We demonstrate experimentally a novel and simple tunable all-optical incoherent negative-tap fiber-optic transversal filter based on a distributed feedback laser diode and high-reflection fiber Bragg gratings (FBGs). In this filter, variable time delay is provided by the cascaded high-reflection FBGs, and tuning is realized either by tuning a different FBG to match the fixed carrier wavelength or by adjusting the carrier wavelength to match a different FBG. The incoherent negative tapping is realized by exploiting the carrier depletion effect in the distributed feedback laser diode.
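The value of a negative tap is easy to see from the transversal filter's frequency response, H(f) = Σ_k a_k exp(-j2πf·kT), with inter-tap delay T set by the FBG spacing. The sketch below uses illustrative tap weights and delay, not the experimental values: an all-positive incoherent filter always peaks at DC, while one negative tap converts that peak into a null:

```python
import numpy as np

# Frequency response of an N-tap transversal filter,
#   H(f) = sum_k a_k * exp(-j * 2*pi * f * k * T).
# Tap values and delay are illustrative, not from the experiment.
T = 100e-12                      # 100 ps delay between taps
taps_pos = np.array([1.0, 1.0])  # all-positive two-tap filter
taps_neg = np.array([1.0, -1.0]) # two-tap filter with a negative tap

f = np.linspace(0.0, 20e9, 2001)

def response(taps):
    k = np.arange(len(taps))
    phases = np.exp(-2j * np.pi * f[None, :] * k[:, None] * T)
    return np.abs(np.sum(taps[:, None] * phases, axis=0))

# At f = 0 the positive filter peaks while the negative-tap filter nulls:
print(response(taps_pos)[0], response(taps_neg)[0])  # → 2.0 0.0
```

This is why negative tapping (here via carrier depletion) matters: it frees incoherent filters from the baseband resonance that all-positive tap weights impose.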
Abstract:
OBJECTIVE: Intravoxel incoherent motion (IVIM) is an MRI technique with potential applications in measuring brain tumor perfusion, but its clinical impact remains to be determined. We assessed the usefulness of IVIM metrics in predicting survival in newly diagnosed glioblastoma. METHODS: Fifteen patients with glioblastoma underwent MRI including spin-echo echo-planar DWI using 13 b-values ranging from 0 to 1000 s/mm^2. Parametric maps for the diffusion coefficient (D), pseudodiffusion coefficient (D*), and perfusion fraction (f) were generated for contrast-enhancing regions (CER) and non-enhancing regions (NCER). Regions of interest were manually drawn in regions of maximum f and on the corresponding dynamic susceptibility contrast images. Prognostic factors were evaluated by Kaplan-Meier survival and Cox proportional hazards analyses. RESULTS: We found that fCER and D*CER correlated with rCBFCER. The best cutoffs for 6-month survival were fCER > 9.86% and D*CER > 21.712 × 10^-3 mm^2/s (100% sensitivity, 71.4% specificity, 100% and 80% positive predictive values, and 80% and 100% negative predictive values; AUC: 0.893 and 0.857, respectively). Treatment yielded the highest hazard ratio (5.484; 95% CI: 1.162-25.88; AUC: 0.723; P = 0.031); fCER combined with treatment predicted survival with 100% accuracy. CONCLUSIONS: The IVIM metrics fCER and D*CER are promising biomarkers of 6-month survival in newly diagnosed glioblastoma.
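For concreteness, the reported diagnostic statistics follow mechanically once a cutoff is fixed. The toy sketch below uses synthetic values and outcomes (not the patient data) to show how a threshold such as fCER > 9.86% yields sensitivity, specificity, PPV and NPV:

```python
# Toy illustration (synthetic labels, not patient data) of how a single
# cutoff is turned into diagnostic statistics.
def cutoff_stats(values, survived, cutoff):
    pred = [v > cutoff for v in values]  # predict 6-month survival
    tp = sum(p and s for p, s in zip(pred, survived))
    tn = sum((not p) and (not s) for p, s in zip(pred, survived))
    fp = sum(p and (not s) for p, s in zip(pred, survived))
    fn = sum((not p) and s for p, s in zip(pred, survived))
    return {"sens": tp / (tp + fn), "spec": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

# Synthetic fCER values [%] and 6-month survival outcomes
values = [12.0, 15.3, 10.1, 9.9, 8.0, 7.2, 9.5, 11.4]
survived = [True, True, True, True, False, False, False, False]
print(cutoff_stats(values, survived, 9.86))
```

With these invented numbers the cutoff gives perfect sensitivity but imperfect specificity, the same qualitative pattern reported for fCER above.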
Abstract:
We show that coherent phase effects may play a relevant role in the nonlinear propagation of partially incoherent waves, leading to unexpected processes of condensation or incoherent soliton generation in instantaneous-response nonlinear media. © 2005 OSA.
Abstract:
The development of the creative industries “proposition” has caused a great deal of controversy. Even as it has been examined and adopted in several, quite diverse, jurisdictions as a policy language seeking to respond to both creative production and consumption in new economic conditions, it is subject to at times withering critique from within academic media, cultural and communication studies. It is held to promote a simplistic narrative of the merging of culture and economics and to represent incoherent policy; the data sources are said to be suspect and underdeveloped; there is a utopianization of “creative” labor; and a benign globalist narrative of the adoption of the idea. This article looks at some of these critiques of the creative industries idea and argues against them.
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. 
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual-energy X-ray photon absorptiometry techniques. Monte Carlo models of dual-energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual-energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone and has poorer precision (approximately twice the coefficient of variation) than the standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements,
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique,
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral,
4. and provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
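The two-component decomposition underlying DEXA, and the reason a third measurement is needed for DPA(+), can be sketched in a few lines: with two energies, the two measured log-attenuations determine exactly two areal densities. The attenuation coefficients below are illustrative round numbers, not tabulated values:

```python
import numpy as np

# Two-component tissue decomposition behind DEXA: the log-attenuation at
# each energy is a linear combination of the areal densities of bone
# mineral and soft tissue,
#   ln(I0/I)_E = mu_bone(E) * t_bone + mu_soft(E) * t_soft,
# so two energies give a 2x2 linear system.
mu = np.array([[0.60, 0.25],   # [mu_bone, mu_soft] at the low energy
               [0.30, 0.20]])  # [mu_bone, mu_soft] at the high energy

t_true = np.array([1.2, 8.0])  # areal densities (bone, soft) [g/cm^2]
attenuation = mu @ t_true      # "measured" ln(I0/I) at the two energies

t_est = np.linalg.solve(mu, attenuation)
print(t_est)  # recovers (1.2, 8.0)
```

Splitting the soft tissue into fat and lean components adds a third unknown and makes this system underdetermined, which is why the DPA(+) technique described above adds the linear path-length measurement.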