872 results for Signal-to-noise ratio


Relevance: 100.00%

Abstract:

The detection of anthropogenic climate change can be improved by recognising the seasonality in the climate change response. This is demonstrated for the North Atlantic jet (zonal wind at 850 hPa, U850) and European precipitation responses projected by the CMIP5 climate models. The U850 future response is characterised by a marked seasonality: an eastward extension of the North Atlantic jet into Europe in November-April, and a poleward shift in May-October. Under the RCP8.5 scenario, the multi-model mean response in U850 in these two extended seasonal means emerges by 2035-2040 for the lower-latitude features and by 2050-2070 for the higher-latitude features, relative to the 1960-1990 climate. This is 5-15 years earlier than when evaluated in the traditional meteorological seasons (December-February, June-August), and it results from an increase in the signal-to-noise ratio associated with the spatial coherence of the response within the extended seasons. The annual mean response lacks important information on the seasonality of the response without improving the signal-to-noise ratio. The same two extended seasons are demonstrated to capture the seasonality of the European precipitation response to climate change and to anticipate its emergence by 10-20 years. Furthermore, some of the regional responses, such as the Mediterranean precipitation decline and the U850 response in North Africa in the extended winter, are projected to emerge by 2020-2025, according to the models with a strong response. Therefore, observations might soon be useful to test aspects of the atmospheric circulation response predicted by some of the CMIP5 models.
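
The emergence dates above rest on a signal-to-noise calculation: a response is taken to emerge once the multi-model mean change (the signal) exceeds the spread used as the noise term. A minimal sketch of such a calculation on synthetic data follows; the variable names, the synthetic trend and noise, and the S/N > 1 threshold are illustrative assumptions rather than the paper's exact protocol.

```python
import numpy as np

rng = np.random.default_rng(1)

years = np.arange(1960, 2101)
n_models = 30

# Synthetic U850 anomalies (m/s): a slow forced trend after 1990 plus
# model-dependent noise standing in for internal variability.
forced = 0.02 * np.clip(years - 1990, 0, None)              # hypothetical forced signal
noise = rng.normal(0.0, 0.8, size=(n_models, years.size))   # hypothetical variability
u850 = forced + noise

def time_of_emergence(anom, years, baseline=(1960, 1990), threshold=1.0):
    """First year at which the multi-model mean anomaly (signal) exceeds
    `threshold` times the inter-model spread (noise), relative to the baseline."""
    base = (years >= baseline[0]) & (years <= baseline[1])
    anom = anom - anom[:, base].mean(axis=1, keepdims=True)  # anomalies vs. baseline
    signal = anom.mean(axis=0)
    spread = anom.std(axis=0)
    sn = np.abs(signal) / spread
    emerged = np.flatnonzero(sn > threshold)
    return years[emerged[0]] if emerged.size else None

print(time_of_emergence(u850, years))
```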

Relevance: 100.00%

Abstract:

Contamination of the electroencephalogram (EEG) by artifacts greatly reduces the quality of the recorded signals, and there is a need for automated artifact removal methods. However, such methods are rarely evaluated against one another via rigorous criteria, with results often presented based upon visual inspection alone. This work presents a comparative study of automatic methods for removing blink, electrocardiographic, and electromyographic artifacts from the EEG. Three methods are considered: wavelet-, blind source separation (BSS)-, and multivariate singular spectrum analysis (MSSA)-based correction. These are applied to data sets containing mixtures of artifacts. Metrics are devised to measure the performance of each method. The BSS method is seen to be the best approach for artifacts of high signal-to-noise ratio (SNR). By contrast, MSSA performs well at low SNRs, but at the expense of a large number of false positive corrections.
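
One natural family of metrics for such a comparison is a power-ratio SNR of the artifact against the underlying EEG, before and after correction. A minimal sketch with hypothetical signals and a placeholder correction step; the paper's exact metrics and the three correction methods themselves are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 256
t = np.arange(0, 10, 1 / fs)                    # 10 s of "EEG" at 256 Hz

eeg = rng.normal(0, 1, t.size)                  # stand-in for clean EEG
blink = 8 * np.exp(-((t - 5) ** 2) / 0.05)      # stand-in blink artifact
contaminated = eeg + blink

def snr_db(signal, reference):
    """Power ratio of `signal` to `reference`, in dB."""
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(reference ** 2))

# Artifact-to-EEG SNR before correction, and residual artifact power after a
# hypothetical correction `cleaned` (placeholder for wavelet/BSS/MSSA output).
print("artifact SNR (dB):", snr_db(blink, eeg))
cleaned = contaminated - 0.9 * blink
print("residual artifact power (dB):", snr_db(cleaned - eeg, eeg))
```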

Relevance: 100.00%

Abstract:

In this paper, we investigate half-duplex two-way dual-hop channel state information (CSI)-assisted amplify-and-forward (AF) relaying in the presence of high-power amplifier (HPA) nonlinearity at the relays. The expression for the end-to-end signal-to-noise ratio (SNR) is derived for the modified system model, taking into account the interference caused by the relaying scheme and the HPA nonlinearity. The performance of the considered relaying network is evaluated in terms of the average symbol error probability (SEP) in Nakagami-m fading channels, using the moment-generating function (MGF) approach. Numerical results are provided, showing the effects of several parameters, such as the quadrature amplitude modulation (QAM) order, the number of relays, the HPA parameters, and the Nakagami parameter, on performance.
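
The MGF approach mentioned above averages the conditional QAM error probability over the distribution of the SNR. The sketch below does this numerically for a single point-to-point Nakagami-m link with square M-QAM, using Craig-type finite integrals; it is a simplified stand-in, since the end-to-end relay SNR derived in the paper has a more involved MGF.

```python
import numpy as np
from scipy.integrate import quad

def mgf_nakagami(s, m, snr_avg):
    """MGF of the instantaneous SNR for Nakagami-m fading, E[exp(-s*gamma)]."""
    return (1.0 + s * snr_avg / m) ** (-m)

def sep_mqam(M, m, snr_avg_db):
    """Average symbol error probability of square M-QAM over Nakagami-m fading,
    via the MGF method (Craig's formula for Q and Q^2)."""
    snr_avg = 10 ** (snr_avg_db / 10.0)
    g = 3.0 / (2.0 * (M - 1.0))
    a = 4.0 / np.pi * (1.0 - 1.0 / np.sqrt(M))
    i1, _ = quad(lambda th: mgf_nakagami(g / np.sin(th) ** 2, m, snr_avg), 0, np.pi / 2)
    i2, _ = quad(lambda th: mgf_nakagami(g / np.sin(th) ** 2, m, snr_avg), 0, np.pi / 4)
    return a * i1 - a * (1.0 - 1.0 / np.sqrt(M)) * i2

for snr_db in (10, 20, 30):
    print(snr_db, sep_mqam(M=16, m=2, snr_avg_db=snr_db))
```

For the full relay system, `mgf_nakagami` would be replaced by the MGF of the end-to-end SNR derived in the paper.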

Relevance: 100.00%

Abstract:

Sixteen monthly air–sea heat flux products from global ocean/coupled reanalyses are compared over 1993–2009 as part of the Ocean Reanalysis Intercomparison Project (ORA-IP). Objectives include assessing the global heat closure, the consistency of temporal variability, comparison with other flux products, and documenting errors against in situ flux measurements at a number of OceanSITES moorings. The ensemble of 16 ORA-IP flux estimates has a global positive bias over 1993–2009 of 4.2 ± 1.1 W m−2. The residual heat gain (i.e., surface flux + assimilation increments) is reduced to a small positive imbalance (typically +1–2 W m−2). This compensation between surface fluxes and assimilation increments is concentrated in the upper 100 m. Implied steady meridional heat transports also improve when assimilation sources are included, except near the equator. The ensemble spread in surface heat fluxes is dominated by the turbulent fluxes (>40 W m−2 over the western boundary currents). The mean seasonal cycle is highly consistent, with variability between products mostly <10 W m−2. The interannual variability has a consistent signal-to-noise ratio (~2) throughout the equatorial Pacific, reflecting ENSO variability. Comparisons at tropical buoy sites (10°S–15°N) over 2007–2009 showed too little ocean heat gain (i.e., flux into the ocean) in ORA-IP (up to 1/3 smaller than the buoy measurements), primarily due to latent heat flux errors in ORA-IP. Comparisons with the Stratus buoy (20°S, 85°W) over a longer period, 2001–2009, also show that the ORA-IP ensemble has a 16 W m−2 smaller net heat gain, nearly all of which is due to too much latent cooling caused by differences in the surface winds imposed in ORA-IP.
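
One of the diagnostics above, the implied steady meridional heat transport, follows from integrating the net surface heat input (plus assimilation sources) from one pole northward, after removing any global imbalance. A minimal sketch on a synthetic zonal-mean flux profile; the flux shape, the 1° latitude bands and the closure adjustment are illustrative assumptions, not the ORA-IP procedure.

```python
import numpy as np

rng = np.random.default_rng(4)
EARTH_RADIUS = 6.371e6                              # m

lat = np.linspace(-89.5, 89.5, 180)                 # latitude band centres (deg)
band_width = np.radians(1.0)
# Hypothetical zonal-mean net surface heat flux plus assimilation increment (W m-2).
flux = 30 * np.cos(np.radians(lat)) ** 3 - 20 + rng.normal(0, 2, lat.size)

band_area = 2 * np.pi * EARTH_RADIUS ** 2 * np.cos(np.radians(lat)) * band_width
flux -= (flux * band_area).sum() / band_area.sum()  # close the global budget first

# Implied northward ocean heat transport: cumulative heat input from the south pole.
transport_pw = np.cumsum(flux * band_area) * 1e-15  # PW
print(transport_pw.max(), transport_pw.min())
```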

Relevance: 100.00%

Abstract:

A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project has been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly, and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends of a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea ice parameters. The eddy-permitting nature of the global reanalyses also allows the eddy kinetic energy to be estimated. The results show that, in general, there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was performed during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities, such as regional trends of sea level and the eddy kinetic energy. A second objective is to show that the ensemble mean of the reanalyses can be evaluated as a single system with regard to its reliability in reproducing climate signals, where both variability and uncertainties are assessed through the ensemble spread and the signal-to-noise ratio. The main advantage of having access to several reanalyses that differ in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods. This uncertainty varies considerably from one ocean parameter to another, especially for global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion of this study is that the eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
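
Eddy kinetic energy, mentioned above as one quantity the eddy-permitting reanalyses make accessible, is computed from the variance of the velocity anomalies. A minimal single-point sketch with synthetic daily velocities; the values and time series are invented and serve only to show the definition.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical daily surface velocities (m/s) at one grid point over a year.
u = 0.20 + 0.15 * rng.normal(size=365)
v = -0.05 + 0.15 * rng.normal(size=365)

# Eddy kinetic energy: half the mean squared velocity anomaly.
u_anom, v_anom = u - u.mean(), v - v.mean()
eke = 0.5 * (u_anom ** 2 + v_anom ** 2).mean()      # m^2 s^-2
print(f"EKE = {eke:.4f} m^2/s^2")
```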

Relevance: 100.00%

Abstract:

Many institutions worldwide have developed ocean reanalysis systems (ORAs) utilizing a variety of ocean models and assimilation techniques. However, the quality of the salinity reanalyses arising from the various ORAs has not yet been comprehensively assessed. In this study, we assess the upper-ocean salinity content (depth-averaged over 0–700 m) from 14 ORAs and 3 objective ocean analysis systems (OOAs) as part of the Ocean Reanalyses Intercomparison Project. Our results show that the best agreement between estimates of salinity from different ORAs is obtained in the tropical Pacific, likely due to the relatively abundant atmospheric and oceanic observations in this region. The largest disagreement in the salinity reanalyses is in the Southern Ocean along the Antarctic Circumpolar Current, as a consequence of the sparseness of both atmospheric and oceanic observations in this region. The West Pacific warm pool is the largest region where the signal-to-noise ratio of the reanalysed salinity anomalies is >1. Therefore, the current salinity reanalyses in the tropical Pacific Ocean may be more reliable than those in the Southern Ocean and in regions along the western boundary currents. Moreover, we find that the assimilation of salinity in ocean regions with relatively strong fronts remains a common problem in most ORAs. The impact of the Argo data on the salinity reanalyses is visible, especially within the upper 500 m, where the interannual variability is large. An increasing trend in globally averaged salinity anomalies is found only within the top 0–300 m layer, but with quite large diversity among the different ORAs. Below 300 m, the globally averaged salinity anomalies from most ORAs switch from a slightly increasing trend before 2002 to a decreasing trend after 2002. This rapid switch in the trend is most likely an artefact of the dramatic change in the observing system due to the implementation of Argo.

Relevance: 100.00%

Abstract:

Uncertainty in ocean analysis methods and deficiencies in the observing system are major obstacles for the reliable reconstruction of the past ocean climate. The variety of existing ocean reanalyses is exploited in a multi-reanalysis ensemble to improve the ocean state estimation and to gauge uncertainty levels. The ensemble-based analysis of the signal-to-noise ratio allows the identification of ocean characteristics for which the estimation is robust (such as tropical mixed-layer depth and upper-ocean heat content) and those where large uncertainty exists (the deep ocean, the Southern Ocean, sea-ice thickness, salinity), providing guidance for future enhancement of the observing and data assimilation systems.
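
The signal-to-noise diagnostic referred to above can be written very simply: for a given ocean quantity, the ensemble mean anomaly across reanalyses acts as the signal and the ensemble spread as the noise. A minimal sketch on synthetic numbers; the variable names, the synthetic trend and the S/N > 1 robustness criterion are illustrative, not the exact definitions used in the intercomparison.

```python
import numpy as np

rng = np.random.default_rng(2)

n_reanalyses, n_years = 10, 30
# Hypothetical yearly anomalies of some ocean variable from each reanalysis.
anom = 0.05 * np.arange(n_years) + rng.normal(0, 0.5, (n_reanalyses, n_years))

signal = anom.mean(axis=0)           # ensemble mean (the "signal")
noise = anom.std(axis=0, ddof=1)     # ensemble spread (the "noise")
snr = np.abs(signal) / noise

robust = snr > 1.0                   # years where the estimate is considered robust
print(f"robust in {robust.sum()} of {n_years} years; mean S/N = {snr.mean():.2f}")
```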

Relevance: 100.00%

Abstract:

We estimate the conditions for the detectability of two planets in a 2/1 mean-motion resonance from radial velocity data, as a function of their masses, the number of observations, and the signal-to-noise ratio. Even for a data set of the order of 100 observations and standard deviations of the order of a few meters per second, we find that Jovian-size resonant planets are difficult to detect if the masses of the planets differ by a factor larger than ~4. This is consistent with the present population of real exosystems in the 2/1 commensurability, most of which have resonant pairs with similar minimum masses, and could indicate that many other resonant systems exist but are currently beyond the detectability limit. Furthermore, we analyze the error distribution in the masses and orbital elements of orbital fits from synthetic data sets for resonant planets in the 2/1 commensurability. For various mass ratios and numbers of data points, we find that the eccentricity of the outer planet is systematically overestimated, although the inner planet's eccentricity suffers a much smaller effect. If the initial conditions correspond to small-amplitude oscillations around stable apsidal corotation resonances, the amplitudes estimated from the orbital fits are biased toward larger amplitudes, in accordance with results found in real resonant extrasolar systems.
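
The detectability argument above reduces to comparing each planet's radial-velocity semi-amplitude with the measurement noise for a given number of observations. A minimal sketch using the standard semi-amplitude formula (constant 28.4329 m/s for masses in Jupiter masses, stellar mass in solar masses, period in years, sin i = 1); the particular masses, periods, error level and the simple K/(sigma/sqrt(N)) detectability proxy are illustrative assumptions.

```python
import numpy as np

def rv_semi_amplitude(m_p_jup, period_yr, m_star_sun=1.0, ecc=0.0):
    """RV semi-amplitude K in m/s for a planet of mass m_p_jup (in M_Jup),
    assuming sin(i) = 1 and m_p << m_star."""
    return (28.4329 / np.sqrt(1.0 - ecc ** 2)
            * m_p_jup * m_star_sun ** (-2.0 / 3.0) * period_yr ** (-1.0 / 3.0))

# Two planets near the 2/1 commensurability with a mass ratio of ~4.
k_inner = rv_semi_amplitude(m_p_jup=0.25, period_yr=1.0)
k_outer = rv_semi_amplitude(m_p_jup=1.00, period_yr=2.0)

n_obs, sigma = 100, 3.0                     # ~100 points, a few m/s errors
# Rough detectability proxy: amplitude versus noise scaled by sqrt(N).
for name, k in (("inner", k_inner), ("outer", k_outer)):
    print(name, f"K = {k:.1f} m/s, K/(sigma/sqrt(N)) = {k / (sigma / np.sqrt(n_obs)):.1f}")
```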

Relevance: 100.00%

Abstract:

We obtained high signal-to-noise ratio long-slit spectra of the galaxy M32 with the Gemini Multi-Object Spectrograph at the Gemini-North telescope. We analysed the integrated spectra by means of full spectral fitting in order to extract the mixture of stellar populations that best represents its composite nature. Three different galactic radii were analysed, from the nuclear region out to 2 arcmin from the centre. This allows us to compare, for the first time, the results of integrated-light spectroscopy with those of resolved colour-magnitude diagrams from the literature. As a main result, we propose that an ancient and an intermediate-age population coexist in M32, and that the balance between these two populations changes between the nucleus and outside one effective radius (1 r_eff), in the sense that the contribution from the intermediate-age population is larger in the nuclear region. We retrieve a weaker signal from a young population at all radii, whose origin is unclear and which may be contamination from horizontal-branch stars, such as the ones identified by Brown et al. in the nuclear region. We compare our metallicity distribution function for a region 1 to 2 arcmin from the centre to the one obtained with photometric data by Grillmair et al. Both distributions are broad, but our spectroscopically derived distribution has a significant component with [Z/Z⊙] ≤ -1, which is not found by Grillmair et al.
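
Full spectral fitting of the kind used above amounts to representing the observed spectrum as a non-negative combination of simple stellar population templates of different ages. A minimal sketch using a non-negative least-squares fit to synthetic templates; the template shapes, wavelength grid, weights and noise level are invented for illustration and are not the models or code used in the paper.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
wave = np.linspace(4000.0, 6000.0, 500)            # wavelength grid (Angstrom)

# Three hypothetical SSP templates: "old", "intermediate", "young".
def template(slope, dip_center, dip_depth):
    cont = 1.0 + slope * (wave - wave.mean()) / np.ptp(wave)
    return cont - dip_depth * np.exp(-((wave - dip_center) ** 2) / 50.0)

templates = np.column_stack([template(-0.3, 5175, 0.4),   # old: red, strong Mg b-like dip
                             template(0.0, 4861, 0.3),    # intermediate
                             template(0.4, 4861, 0.1)])   # young: blue continuum

# Synthetic "observed" spectrum: 60% old + 40% intermediate light, plus noise.
true_weights = np.array([0.6, 0.4, 0.0])
observed = templates @ true_weights + rng.normal(0, 0.01, wave.size)

weights, _ = nnls(templates, observed)              # non-negative light fractions
print("recovered light fractions:", weights / weights.sum())
```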

Relevance: 100.00%

Abstract:

In this work, the results of a spectroscopic study of the southern-field narrow-line Be star HD 171054 are presented. High-dispersion, high signal-to-noise ratio spectra allowed the estimation of the fundamental photospheric parameters, such as the projected rotational velocity, effective temperature, and surface gravity, from non-LTE stellar atmosphere models. From these parameters and the microturbulence, the abundances of He, C, N, O, Mg, Al and Si are estimated for this object. The results show that C is depleted whereas N is overabundant compared with the Sun and OB stars in the solar vicinity. Oxygen and helium are close to the solar values. Magnesium is underabundant by 0.43 dex, and aluminium and silicon are overabundant.

Relevance: 100.00%

Abstract:

Purpose: To obtain cerebral perfusion territories of the left, the right, and the posterior circulation in humans with high signal-to-noise ratio (SNR) and robust delineation. Materials and Methods: Continuous arterial spin labeling (CASL) was implemented using a dedicated radiofrequency (RF) coil, positioned over the neck, to label the major cerebral feeding arteries in humans. Selective labeling was achieved by flow-driven adiabatic fast passage and by tilting the longitudinal labeling gradient about the Y-axis by θ = ±60°. Results: Mean cerebral blood flow (CBF) values in gray matter (GM) and white matter (WM) were 74 ± 13 mL·100 g⁻¹·min⁻¹ and 14 ± 13 mL·100 g⁻¹·min⁻¹, respectively (N = 14). There were no signal differences between the left and right hemispheres when θ = 0° (P > 0.19), indicating efficient labeling of both hemispheres. When θ = +60°, the signal in GM in the left hemisphere, 0.07 ± 0.06%, was 92% lower than in the right hemisphere, 0.85 ± 0.30% (P < 1 × 10⁻⁹), while for θ = −60°, the signal in the right hemisphere, 0.16 ± 0.13%, was 82% lower than on the contralateral side, 0.89 ± 0.22% (P < 1 × 10⁻¹⁰). Similar attenuations were obtained in WM. Conclusion: Clear delineation of the left and right cerebral perfusion territories was obtained, allowing discrimination of the anterior and posterior circulation in each hemisphere.

Relevance: 100.00%

Abstract:

In magnetic resonance imaging (MRI), whether for human or animal studies, the main requirements for radiofrequency (RF) coils are to produce a homogeneous RF field when used as a transmitter and to have the best signal-to-noise ratio (SNR) when used as a receiver. In addition, they need to be easily frequency-adjustable and to maintain a 50 Ω input impedance match under several different load conditions. New theoretical and practical concepts are presented here for considerably enhancing RF coil homogeneity in MRI experiments on small animals. To optimize field homogeneity, we performed simulations based on the Biot-Savart law, varying the coil's window angle to find the optimum value. However, when the coil's dimensions are of the same order as the wavelength, transmission line theory shows that differences in electrical length and mutual inductance between adjacent strip conductors decrease both field homogeneity and SNR. The problematic mutual-inductance interactions between strip conductors were eliminated by inserting crossings at half the electrical length, avoiding distortion of the current density and thus removing sources of field inhomogeneity. Experimental results show that measured field maps and simulations are in good agreement. The new coil design described here, dubbed the double-crossed saddle, has field homogeneity and SNR superior to those of a linearly driven 8-rung birdcage coil. One of our major findings was that the effects of mutual inductance are more significant than differences in electrical length at this frequency and these coil dimensions. In vitro images of a primate (Cebus apella) brain were acquired, confirming the superiority of the double-crossed saddle.
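
The window-angle optimization mentioned above can be reproduced in outline by discretizing the conductors, summing Biot-Savart contributions over a region of interest, and scoring homogeneity as the relative spread of the field magnitude. The sketch below uses a deliberately crude saddle geometry with arbitrary dimensions (radius, length and ROI size are invented), so it illustrates the method rather than the authors' actual coil model.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def biot_savart(path, point, current):
    """Field at `point` from a closed polyline `path` (N x 3) carrying `current`,
    summing mu0*I/(4*pi) * dl x r / |r|^3 over straight segments."""
    dl = np.diff(path, axis=0)
    mid = 0.5 * (path[:-1] + path[1:])
    r = point - mid
    r3 = np.linalg.norm(r, axis=1, keepdims=True) ** 3
    return (MU0 * current / (4 * np.pi) * np.cross(dl, r) / r3).sum(axis=0)

def saddle_pair(window_deg, radius=0.04, length=0.10, n=80):
    """Crude saddle pair: two arc-plus-rung loops on opposite sides of a cylinder."""
    half = np.radians(window_deg) / 2.0
    loops = []
    for offset, current in ((0.0, 1.0), (np.pi, -1.0)):
        arc = offset + np.linspace(-half, half, n)
        top = np.c_[radius * np.cos(arc), radius * np.sin(arc), np.full(n, length / 2)]
        down = np.c_[np.full(n, top[-1, 0]), np.full(n, top[-1, 1]),
                     np.linspace(length / 2, -length / 2, n)]
        bot = np.c_[radius * np.cos(arc[::-1]), radius * np.sin(arc[::-1]),
                    np.full(n, -length / 2)]
        up = np.c_[np.full(n, bot[-1, 0]), np.full(n, bot[-1, 1]),
                   np.linspace(-length / 2, length / 2, n)]
        loops.append((np.vstack([top, down, bot, up]), current))
    return loops

def inhomogeneity(window_deg, roi=0.01, grid=5):
    """Relative standard deviation of |B| over a small cube at the coil centre."""
    loops = saddle_pair(window_deg)
    pts = np.linspace(-roi, roi, grid)
    mags = [np.linalg.norm(sum(biot_savart(p, np.array([x, y, z]), i) for p, i in loops))
            for x in pts for y in pts for z in pts]
    mags = np.array(mags)
    return mags.std() / mags.mean()

for angle in (90, 105, 120, 135):
    print(angle, inhomogeneity(angle))
```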

Relevance: 100.00%

Abstract:

In this paper, we show that the steady-state free precession sequence can be used to acquire high-resolution ¹³C nuclear magnetic resonance spectra for qualitative analysis. The analysis of a brucine sample using this sequence with a 60° flip angle and a time interval between pulses of 300 ms (acquisition time, 299.7 ms; recycle delay, 300 ms) resulted in a spectrum with a twofold enhancement in signal-to-noise ratio compared to the standard ¹³C sequence. The gain was larger when a much shorter interval between pulses (100 ms) was used, yielding more than a fivefold enhancement in signal-to-noise ratio, equivalent to a more than 20-fold reduction in total data recording time. However, this short interval between pulses produces a spectrum with severe phase and truncation anomalies. We demonstrate that these anomalies can be minimized by applying an appropriate apodization function and plotting the spectrum in magnitude mode.
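
The equivalence quoted above between an SNR gain and a saving in recording time follows from SNR growing as the square root of the accumulated acquisition time. A one-line check of that arithmetic, assuming that scaling holds for these acquisitions:

```python
snr_gain = 5.0                    # reported SNR enhancement of the 100 ms SSFP run
time_reduction = snr_gain ** 2    # SNR ~ sqrt(recording time) => time scales as gain^2
print(time_reduction)             # 25.0, consistent with the reported ">20-fold" reduction
```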

Relevance: 100.00%

Abstract:

The rapid development of data transfer over the internet has made it easier to send data quickly and accurately to its destination. There are many transmission media, such as e-mail, but at the same time valuable information can be modified and misused through hacking. To transfer data securely, without modification, approaches such as cryptography and steganography are used. This paper deals with image steganography and with related security issues, giving a general overview of cryptography, steganography and digital watermarking. The problem of copyright violation of multimedia data has grown with the enormous growth of computer networks, which provide fast, error-free transmission of unauthorized and possibly manipulated copies of multimedia content. To be effective for copyright protection, a digital watermark must be robust: difficult to remove from the object in which it is embedded despite a variety of possible attacks. To send messages safely and securely, we use invisible watermarking, embedding the message with the least significant bit (LSB) steganographic technique. The standard LSB technique embeds the message in every pixel; the contribution of the proposed watermarking scheme is to embed the message only in the image edges. Even if an attacker knows that the system uses the LSB technique, the correct message cannot be recovered. To make the system more robust and secure, a cryptographic algorithm based on the Vigenère square is added, so the message is transmitted as ciphertext, a further advantage of the proposed system. The standard Vigenère square algorithm works with either lower-case or upper-case letters; the proposed algorithm extends it to include numbers, so the key can combine characters and digits. By modifying the existing algorithm in this way and combining cryptography with steganography, we develop a secure and strong watermarking method. Its performance has been analysed by evaluating the robustness of the algorithm with the peak signal-to-noise ratio (PSNR) and mean square error (MSE) against the quality of the image for large amounts of data. The results of the proposed scheme show a high PSNR of 89 dB with a small MSE of 0.0017, indicating that the proposed watermarking system is secure and robust for hiding information in digital media, since it combines the properties of both steganography and cryptography.
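
The scheme outlined above chains an extended Vigenère cipher, LSB embedding restricted to edge pixels, and PSNR/MSE evaluation. The sketch below strings simplified versions of these pieces together; the alphabet, the gradient-based edge detector, the key, the message and the random cover image are invented stand-ins, so this illustrates the idea rather than reproducing the authors' implementation.

```python
import numpy as np

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789"

def vigenere(text, key, decrypt=False):
    """Vigenere square extended to upper case, lower case and digits."""
    n, out = len(ALPHABET), []
    for i, ch in enumerate(text):
        t, k = ALPHABET.index(ch), ALPHABET.index(key[i % len(key)])
        out.append(ALPHABET[(t - k) % n if decrypt else (t + k) % n])
    return "".join(out)

def edge_mask(gray, thresh=30):
    """Very rough edge map from horizontal/vertical intensity differences."""
    gx = np.abs(np.diff(gray.astype(int), axis=1, prepend=0))
    gy = np.abs(np.diff(gray.astype(int), axis=0, prepend=0))
    return (gx + gy) > thresh

def embed_lsb(gray, message, key):
    """Encrypt the message, then write its bits into the LSBs of edge pixels only."""
    cipher = vigenere(message, key)
    bits = np.unpackbits(np.frombuffer(cipher.encode(), dtype=np.uint8))
    stego = gray.copy()
    idx = np.flatnonzero(edge_mask(gray))
    assert bits.size <= idx.size, "message too long for the available edge pixels"
    flat = stego.ravel()
    flat[idx[:bits.size]] = (flat[idx[:bits.size]] & 0xFE) | bits
    return stego

def psnr_mse(original, stego):
    mse = np.mean((original.astype(float) - stego.astype(float)) ** 2)
    psnr = np.inf if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
    return psnr, mse

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)   # stand-in cover image
stego = embed_lsb(cover, "HelloWorld42", key="Key7")
print(psnr_mse(cover, stego))
```

Extraction would reverse the steps with the same key; a real implementation must also ensure the edge map used by the extractor matches the embedder's, for example by deriving it from bit planes that embedding does not modify.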

Relevance: 100.00%

Abstract:

This letter addresses the design of a precoder for multiple-transmit-antenna communication systems with spatially and temporally correlated fading channels. Using the asymptotic (high signal-to-noise ratio) mean-square error of the channel estimates, the letter derives a precoder for unitary space-time codes that can exploit the spatiotemporal correlation of the time-varying fading channels. Simulation results illustrate that significant performance gains can be achieved by using the new precoder.