967 results for Detectors.
Abstract:
In this work, a broad analysis of local search multiuser detection (LS-MUD) for direct-sequence code division multiple access (DS/CDMA) systems under multipath channels is carried out, considering the performance-complexity trade-off. The robustness of the LS-MUD to variations in loading, E_b/N_0, the near-far effect, the number of Rake receiver fingers and errors in the channel coefficient estimates is verified. A comparative analysis of the bit error rate (BER) versus complexity trade-off is carried out among LS, the genetic algorithm (GA) and particle swarm optimization (PSO). Based on the deterministic behavior of the LS algorithm, simplifications of the cost function calculation are also proposed, yielding more efficient algorithms (simplified and combined LS-MUD versions) and opening new perspectives for MUD implementation. The computational complexity is expressed in terms of the number of operations required for convergence. We conclude that the simplified LS (s-LS) method is the most efficient regardless of system conditions, achieving better performance at lower complexity than the other heuristic detectors. In addition, its deterministic strategy and absence of input parameters make the s-LS algorithm the most appropriate for the MUD problem. (C) 2008 Elsevier GmbH. All rights reserved.
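As an illustration of the kind of search involved, the following minimal sketch applies 1-opt local search to the maximum-likelihood MUD metric for a synchronous model; the variable names, the flip rule and the incremental gain evaluation are illustrative assumptions, not the paper's exact s-LS.

    import numpy as np

    def ls_mud(z, R, b0):
        # 1-opt local search detector: maximize the maximum-likelihood metric
        # L(b) = 2 b^T z - b^T R b over bit vectors b in {-1, +1}^K, where z
        # holds the matched-filter outputs and R the code correlation matrix.
        # The gain of flipping bit k is evaluated incrementally in O(K), the
        # kind of cost-function simplification the abstract refers to.
        b = b0.copy()
        while True:
            # gain of flipping each bit: g_k = -4 b_k (z_k - (R b)_k + R_kk b_k)
            g = -4.0 * b * (z - R @ b + np.diag(R) * b)
            k = int(np.argmax(g))
            if g[k] <= 0:
                return b          # no improving flip left: 1-opt local optimum
            b[k] = -b[k]          # accept the best improving flip

Seeding with the conventional detector output, b0 = np.where(z >= 0, 1, -1), each accepted flip strictly increases L, so the search terminates at a local optimum after finitely many flips.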
Abstract:
This work proposes the use of evolutionary computation to jointly solve the maximum-likelihood multiuser channel estimation (MuChE) and detection problems in direct-sequence code division multiple access (DS/CDMA). The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods from the literature. Simulation results for a genetic algorithm (GA) applied to joint MuChE and multiuser detection (MuD) in multipath DS/CDMA show that the proposed genetic algorithm multiuser channel estimation (GAMuChE) yields a normalized mean square estimation error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies and medium system load, while exhibiting lower complexity than both maximum likelihood multiuser channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multiuser detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity compared to the optimum multiuser detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was (jointly) analyzed in terms of the number of operations necessary to reach convergence, and compared to other joint MuChE-MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
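For the detection half of the problem, a bare-bones GA over the same maximum-likelihood metric might look like the sketch below; population size, selection, crossover and mutation rates are illustrative assumptions, not the paper's tuned GAMuD parameters.

    import numpy as np

    def ga_mud(z, R, pop=40, gens=60, pmut=0.02, rng=None):
        # Bare-bones GA detector over the ML metric L(b) = 2 b^T z - b^T R b,
        # with z the matched-filter outputs and R the code correlation matrix.
        rng = np.random.default_rng() if rng is None else rng
        K = len(z)
        P = rng.choice([-1, 1], size=(pop, K))
        P[0] = np.where(z >= 0, 1, -1)              # seed: conventional detector
        fit = lambda B: 2.0 * B @ z - np.einsum('ij,jk,ik->i', B, R, B)
        for _ in range(gens):
            f = fit(P)
            elite = P[np.argsort(f)[-pop // 2:]]    # keep the fitter half
            pa = elite[rng.integers(len(elite), size=pop)]
            pb = elite[rng.integers(len(elite), size=pop)]
            mask = rng.random((pop, K)) < 0.5       # uniform crossover
            P = np.where(mask, pa, pb)
            P[rng.random((pop, K)) < pmut] *= -1    # bit-flip mutation
        return P[np.argmax(fit(P))]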
Abstract:
The carotenoid composition was evaluated during ripening of papaya cv. 'Golden' under untreated (control) conditions and after treatment with ethylene or 1-methylcyclopropene (1-MCP). At the end of the experiments, the total carotenoid content in the control group (2194.4 μg/100 g) was twice as high as that found in the ethylene- (1018.1 μg/100 g) and 1-MCP-treated (654.5 μg/100 g) samples. Separation of 21 carotenoids by HPLC coupled to photodiode array and mass spectrometry detectors showed that no minor carotenoid seemed to be particularly favoured by the treatments. Lycopene was the major carotenoid in all untreated and gas-treated samples, ranging from 461.5 to 1321.6 μg/100 g at the end of the experiments. According to the proposed biosynthetic pathway, lycopene is the central compound: as the most abundant carotenoid it indicates strong stimulation of its upstream steps during ripening, and it is the source for the synthesis of derivative compounds such as beta-cryptoxanthin. The influence of both gas treatments on the carotenoid biosynthetic pathway was considered. (C) 2011 Elsevier Inc. All rights reserved.
Abstract:
Cultivar, growing conditions and geographical origin are factors that influence the carotenoid composition of fruits. Because the loquat cultivars evaluated in this study, Centenária, Mizauto, Mizuho, Mizumo and Néctar de Cristal, had not previously been investigated, the present work was carried out to determine and compare their carotenoid composition by high-performance liquid chromatography coupled to photodiode array and mass spectrometry detectors (HPLC-PDA-MS/MS). Twenty-five carotenoids were separated on a C30 column, and 23 of them were identified. All-trans-beta-carotene (19-55%), all-trans-beta-cryptoxanthin (18-28%), 5,6:5',6'-diepoxy-beta-cryptoxanthin (9-18%) and 5,6-epoxy-beta-cryptoxanthin (7-10%) were the main carotenoids. The total carotenoid content ranged from 196 μg/100 g (cv. Néctar de Cristal) to 3020 μg/100 g (cv. Mizumo). The carotenoid profile of cv. Néctar de Cristal differed from the other cultivars, in agreement with its cream pulp colour, in contrast to the orange pulp of the other four. Cultivars Mizauto, Mizuho, Mizumo and Centenária showed provitamin A values between 89 and 162 μg RAE/100 g and can be considered good sources of this provitamin. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Quantum computers promise to increase greatly the efficiency of solving problems such as factoring large integers, combinatorial optimization and quantum physics simulation. One of the greatest challenges now is to implement the basic quantum-computational elements in a physical system and to demonstrate that they can be reliably and scalably controlled. One of the earliest proposals for quantum computation is based on implementing a quantum bit with two optical modes containing one photon. The proposal is appealing because of the ease with which photon interference can be observed. Until now, it suffered from the requirement for non-linear couplings between optical modes containing few photons. Here we show that efficient quantum computation is possible using only beam splitters, phase shifters, single photon sources and photo-detectors. Our methods exploit feedback from photo-detectors and are robust against errors from photon loss and detector inefficiency. The basic elements are accessible to experimental investigation with current technology.
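As a numerical illustration of the single-qubit part of this claim, a photon shared between two optical modes (a dual-rail qubit) is rotated by beam splitters and phase shifters alone; the conventions and mode labels below are a common textbook choice, not taken from the paper.

    import numpy as np

    # Dual-rail photonic qubit: one photon in two modes,
    # logical |0> = photon in mode a, logical |1> = photon in mode b.

    def beam_splitter(theta):
        # Lossless beam splitter (real rotation convention).
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    def phase_shifter(phi):
        # Phase shift applied to mode b only.
        return np.diag([1.0, np.exp(1j * phi)])

    # A 50/50 beam splitter plus a pi phase shift acts as a Hadamard-like gate.
    U = phase_shifter(np.pi) @ beam_splitter(np.pi / 4)
    psi = np.array([1.0, 0.0])             # photon in mode a (logical |0>)
    print(np.abs(U @ psi) ** 2)            # -> [0.5 0.5]: equal detector odds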
Abstract:
We demonstrate a contradiction of quantum mechanics with local hidden variable theories for continuous quadrature phase amplitude (position and momentum) measurements. For any quantum state, this contradiction is lost for situations where the quadrature phase amplitude results are always macroscopically distinct. We show that for optical realizations of this experiment, where one uses homodyne detection techniques to perform the quadrature phase amplitude measurement, one has an amplification prior to detection, so that macroscopic fields are incident on photodiode detectors. The high efficiencies of such detectors may open a way for a loophole-free test of local hidden variable theories.
Abstract:
Recent research (Kuhl, 1991) has suggested that the internal structure of vowel categories is graded in terms of stimulus goodness. It has been proposed that a best-instance stimulus reflects a central point, or prototype, which effectively renders within-category members perceptually more similar. Discrimination experiments suggest a nonlinear relationship between acoustic and perceptual space near category centers (Iverson & Kuhl, 1995b). This phenomenon has been described as the perceptual magnet effect. The present study investigated the presence of the perceptual magnet effect in five Australian vowel categories. Australian English speakers identified, rated, and discriminated between a pool of 32 vowel stimuli that varied in F1 and F2 values. The results from Experiments 1 and 2 showed that subjects were able to judge the quality and identity of each stimulus and that a general grading of stimulus quality was reported; this grading was not symmetrical, and the subjects' responses varied considerably. Experiment 3 exercised closer control over the discrimination-task methodology and over contextual factors influencing the test materials. Despite this, no evidence of warping of perceptual space was found in the discrimination data. In general, these results do not support the existence of the perceptual magnet effect, and explanations for this finding are discussed.
Abstract:
In this work, we describe the process of teleportation between Alice, in an inertial frame, and Rob, who is in uniform acceleration with respect to Alice. The fidelity of the teleportation is reduced by Davies-Unruh radiation in Rob's frame. Insofar as teleportation is a measure of entanglement, our results suggest that quantum entanglement is degraded in non-inertial frames. We discuss this reduction in fidelity for both bosonic and fermionic resources.
Abstract:
New differential linear coherent scattering coefficient (μ_CS) data for four biological tissue types (fat pork, tendon chicken, and adipose and fibroglandular human breast tissues), covering a large momentum transfer interval (0.07 ≤ q ≤ 70.5 nm⁻¹) obtained by combining WAXS and SAXS data, are presented in order to emphasize the need to update the default database by including molecular interference and large-scale arrangement effects. The results showed that, in the low momentum transfer region, the differential linear coherent scattering coefficient reflects the influence of large-scale arrangements, mainly due to collagen fibrils in the tendon chicken and fibroglandular breast samples and to triacylglycerides in the fat pork and adipose breast samples. At high momentum transfer, μ_CS reflects molecular interference effects related to water for the tendon chicken and fibroglandular samples and to fatty acids for the fat pork and adipose samples. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Objectives: This study was designed to evaluate whether the absence of coronary calcium can rule out ≥50% coronary stenosis or the need for revascularization. Background: The latest American Heart Association guidelines suggest that a calcium score (CS) of zero might exclude the need for coronary angiography among symptomatic patients. Methods: A substudy was made of the CORE64 (Coronary Evaluation Using Multi-Detector Spiral Computed Tomography Angiography Using 64 Detectors) multicenter trial comparing the diagnostic performance of 64-detector computed tomography with conventional angiography. Patients clinically referred for conventional angiography were asked to undergo a CS scan up to 30 days beforehand. Results: In all, 291 patients were included, of whom 214 (73%) were male; the mean age was 59.3 ± 10.0 years. A total of 14 (5%) patients had low, 218 (75%) intermediate, and 59 (20%) high pre-test probability of obstructive coronary artery disease. The overall prevalence of ≥50% stenosis was 56%. A total of 72 patients had CS = 0, among whom 14 (19%) had at least one ≥50% stenosis. The overall sensitivity of CS = 0 for predicting the absence of ≥50% stenosis was 45%, specificity was 91%, negative predictive value was 68%, and positive predictive value was 81%. Additionally, revascularization was performed in 9 (12.5%) CS = 0 patients within 30 days of the CS. Of 383 vessels without any coronary calcification, 47 (12%) presented with ≥50% stenosis; and of 64 totally occluded vessels, 13 (20%) had no calcium. Conclusions: The absence of coronary calcification does not exclude obstructive stenosis or the need for revascularization among patients with high enough suspicion of coronary artery disease to be referred for coronary angiography, in contrast with the published recommendations. Total coronary occlusion frequently occurs in the absence of any detectable calcification. (Coronary Evaluation Using Multi-Detector Spiral Computed Tomography Angiography Using 64 Detectors [CORE-64]; NCT00738218) (J Am Coll Cardiol 2010;55:627-34) (C) 2010 by the American College of Cardiology Foundation
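The reported test characteristics can be reproduced from the counts given in the abstract; a small sanity-check sketch (the 163/128 stenosis split follows from rounding the 56% prevalence over 291 patients):

    # CS = 0 treated as a test for the ABSENCE of >=50% stenosis.
    n, n_cs0, sten_cs0 = 291, 72, 14
    n_sten = round(0.56 * n)                      # 163 patients with stenosis
    no_sten_cs0 = n_cs0 - sten_cs0                # 58: CS = 0 and no stenosis
    sten_cspos = n_sten - sten_cs0                # 149: CS > 0 and stenosis
    no_sten_cspos = n - n_sten - no_sten_cs0      # 70: CS > 0, no stenosis

    sensitivity = no_sten_cs0 / (n - n_sten)              # 58/128  ~ 0.45
    specificity = sten_cspos / n_sten                     # 149/163 ~ 0.91
    ppv = no_sten_cs0 / n_cs0                             # 58/72   ~ 0.81
    npv = sten_cspos / (sten_cspos + no_sten_cspos)       # 149/219 ~ 0.68
    print(f"sens {sensitivity:.2f}  spec {specificity:.2f}  "
          f"PPV {ppv:.2f}  NPV {npv:.2f}")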
Coronary CT angiography using 64 detector rows: methods and design of the multi-centre trial CORE-64
Abstract:
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and the high sensitivity and negative predictive value found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios, suggestive of small-study bias and highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography Using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography against conventional coronary angiography in nine centres worldwide. In conclusion, the multi-centre, multi-institutional and multi-continental CORE-64 trial has great potential to definitively assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
Abstract:
Background: The accuracy of multidetector computed tomographic (CT) angiography involving 64 detectors has not been well established. Methods: We conducted a multicenter study to examine the accuracy of 64-row, 0.5-mm multidetector CT angiography as compared with conventional coronary angiography in patients with suspected coronary artery disease. Nine centers enrolled patients who underwent calcium scoring and multidetector CT angiography before conventional coronary angiography. In 291 patients with calcium scores of 600 or less, segments 1.5 mm or more in diameter were analyzed by means of CT and conventional angiography at independent core laboratories. Stenoses of 50% or more were considered obstructive. The area under the receiver-operating-characteristic curve (AUC) was used to evaluate diagnostic accuracy relative to that of conventional angiography and subsequent revascularization status, whereas disease severity was assessed with the use of the modified Duke Coronary Artery Disease Index. Results: A total of 56% of patients had obstructive coronary artery disease. The patient-based diagnostic accuracy of quantitative CT angiography for detecting or ruling out stenoses of 50% or more according to conventional angiography revealed an AUC of 0.93 (95% confidence interval [CI], 0.90 to 0.96), with a sensitivity of 85% (95% CI, 79 to 90), a specificity of 90% (95% CI, 83 to 94), a positive predictive value of 91% (95% CI, 86 to 95), and a negative predictive value of 83% (95% CI, 75 to 89). CT angiography was similar to conventional angiography in its ability to identify patients who subsequently underwent revascularization: the AUC was 0.84 (95% CI, 0.79 to 0.88) for multidetector CT angiography and 0.82 (95% CI, 0.77 to 0.86) for conventional angiography. A per-vessel analysis of 866 vessels yielded an AUC of 0.91 (95% CI, 0.88 to 0.93). Disease severity ascertained by CT and conventional angiography was well correlated (r=0.81; 95% CI, 0.76 to 0.84). Two patients had important reactions to contrast medium after CT angiography. Conclusions: Multidetector CT angiography accurately identifies the presence and severity of obstructive coronary artery disease and subsequent revascularization in symptomatic patients. The negative and positive predictive values indicate that multidetector CT angiography cannot replace conventional coronary angiography at present. (ClinicalTrials.gov number, NCT00738218.).
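The AUC that anchors these results is the Mann-Whitney probability that a randomly chosen diseased patient scores higher than a randomly chosen non-diseased one; the sketch below computes that statistic generically and is not the trial's analysis code (the example scores are made up).

    import numpy as np

    def auc_mann_whitney(scores_pos, scores_neg):
        # AUC as the probability that a randomly chosen positive (diseased)
        # case scores above a randomly chosen negative one, ties counted half.
        s_pos = np.asarray(scores_pos, dtype=float)[:, None]
        s_neg = np.asarray(scores_neg, dtype=float)[None, :]
        return (s_pos > s_neg).mean() + 0.5 * (s_pos == s_neg).mean()

    # Toy usage: hypothetical percent-stenosis scores from quantitative CT for
    # patients with and without >=50% stenosis on conventional angiography.
    print(auc_mann_whitney([70, 55, 80, 62], [20, 45, 35, 58]))   # -> 0.9375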
Abstract:
Purpose: Several attempts to determine the transit time of a high dose rate (HDR) brachytherapy unit have been reported in the literature, with controversial results. The determination of the source speed is necessary to accurately calculate the transient dose in brachytherapy treatments. In those studies, only the average speed of the source was measured as a parameter for transit dose calculation, which does not account for the realistic movement of the source and is therefore inaccurate for numerical simulations. The purpose of this work is to report the implementation and technical design of an optical-fiber-based detector to directly measure the instantaneous speed profile of a ¹⁹²Ir source in a Nucletron HDR brachytherapy unit. Methods: To accomplish this task, we developed a setup that uses the Cerenkov light induced in optical fibers as the detection signal for the radiation source moving inside the HDR catheter. As the ¹⁹²Ir source travels between two optical fibers separated by a known distance, the thresholds of the induced signals are used to extract the transit time and thus the velocity. The high resolution of the detector enables measurement of the transit time over short fiber separations, providing the instantaneous speed. Results: Accurate, high-resolution speed profiles of the ¹⁹²Ir source traveling from the safe to the end of the catheter and between dwell positions are presented. The maximum and minimum velocities of the source were found to be 52.0 ± 1.0 and 17.3 ± 1.2 cm/s. The authors demonstrate that the radiation source follows uniformly accelerated linear motion with acceleration |a| = 113 cm/s². In addition, the authors compare the average speed measured using the optical fiber detector with values from the literature, finding deviations of up to 265%. Conclusions: To the best of the authors' knowledge, this is the first direct measurement of the instantaneous speed profile of a radiation source in an HDR brachytherapy unit traveling from the unit safe to the end of the catheter and between interdwell positions. The method is feasible and accurate enough for quality assurance tests and provides a unique database for efficient computational simulations of the transient dose. (C) 2010 American Association of Physicists in Medicine. [DOI: 10.1118/1.3483780]
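The measurement idea reduces to dividing known fiber separations by measured threshold-crossing intervals; a minimal sketch with illustrative fiber positions and the acceleration reported in the abstract:

    import numpy as np

    def speeds_between_fibers(t, x):
        # Instantaneous speed estimates from threshold-crossing times t of
        # fibers at known positions x: speed = separation / transit time.
        t, x = np.asarray(t), np.asarray(x)
        return np.diff(x) / np.diff(t)

    # Toy usage: uniformly accelerated motion x = a t^2 / 2 with the reported
    # |a| = 113 cm/s^2, "sampled" by fibers placed every 2 cm along the catheter.
    a = 113.0                               # cm/s^2
    x = np.arange(2.0, 22.0, 2.0)           # fiber positions, cm (illustrative)
    t = np.sqrt(2.0 * x / a)                # crossing times, s
    print(speeds_between_fibers(t, x))      # cm/s, rising like sqrt(2 a x)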
Abstract:
An analyzer-based X-ray phase-contrast imaging (ABI) setup has been mounted at the Brazilian Synchrotron Light Laboratory (LNLS) for multiple imaging radiography (MIR) purposes. The algorithm employed for treating the MIR data collected at LNLS is described, and its reliability in extracting the distinct types of contrast that can be obtained with MIR is demonstrated by analyzing a test sample (thin polyamide wire). As a practical application, the possibility of studying ophthalmic tissues, corneal sequestra in this case, via MIR is investigated. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
It is not possible to measure the phase of an optical mode using linear optics without introducing an extra phase uncertainty. This extra phase variance is quite large for heterodyne measurements; however, it is possible to reduce it to the theoretical limit of log n̄/(4n̄²) using adaptive measurements. These measurements are quite sensitive to experimental inaccuracies, especially time delays and inefficient detectors. Here it is shown that the minimum introduced phase variance when there is a time delay of τ is τ/(8n̄). This result is verified numerically, showing that the introduced phase variance approaches this limit for most of the adaptive schemes using the best final phase estimate. The main exception is the adaptive mark II scheme with simplified feedback, which is extremely sensitive to time delays. The extra phase variance due to time delays is also considered for the mark I case with simplified feedback, verifying the τ/2 result obtained by Wiseman and Killip both by a more rigorous analytic technique and numerically.
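Restated, the two limits are V0 ≈ log n̄/(4n̄²) with no delay and Vτ ≈ τ/(8n̄) with delay τ (my reconstruction of the garbled expressions); the sketch below compares them and shows the delay term dominating once τ > 2 log n̄/n̄.

    import numpy as np

    # Comparing the two introduced-phase-variance limits quoted above
    # (nbar = mean photon number; units of tau are illustrative):
    #   no delay:   V0   ~ log(nbar) / (4 nbar^2)
    #   delay tau:  Vtau ~ tau / (8 nbar)
    nbar = np.array([10.0, 100.0, 1000.0])
    tau = 1e-3
    v0 = np.log(nbar) / (4.0 * nbar ** 2)
    vtau = tau / (8.0 * nbar)
    print(np.column_stack([nbar, v0, vtau]))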