97 results for Probability of choice


Relevance: 90.00%

Abstract:

A standard CDMA system is considered, and an extension of Pearson's results is used to determine the density function of the interference. The method is shown to work well in some cases, but not in others. Nevertheless, this approach can be useful in further determining the probability of error of the system with minimal computational requirements.
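The moment-based reasoning behind Pearson-family fits can be illustrated with a toy Monte Carlo sketch. Everything here is an assumption for illustration (5 interferers, length-31 random sequences, a simplified synchronous interference term), not the system analysed in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, trials = 5, 31, 200_000   # interferers, sequence length, Monte Carlo runs (illustrative)

# Each interferer contributes a data bit (+/-1) times the normalized
# cross-correlation of two random +/-1 spreading sequences.
bits = rng.choice([-1.0, 1.0], size=(trials, K))
chips = rng.choice([-1.0, 1.0], size=(trials, K, N))
rho = chips.sum(axis=2) / N                 # normalized random cross-correlation
interference = (bits * rho).sum(axis=1)     # total multiple-access interference

m, s = interference.mean(), interference.std()
z = (interference - m) / s
skew = (z**3).mean()
kurt = (z**4).mean()
print("mean %.4f  var %.4f  skew %.3f  kurtosis %.3f" % (m, s**2, skew, kurt))
# A Gaussian has skewness 0 and kurtosis 3; departures from these values are
# exactly what moment-based (Pearson-type) density fits try to capture.
```

The first four moments computed this way are the inputs a Pearson-system fit would use to select and parameterise a density for the interference.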

Relevance: 90.00%

Abstract:

The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the application of the central limit theorem is attempted, leading to a Gaussian distribution. The authors seek to characterise the multiple-access interference as a random walk with a random number of steps, for random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error, in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation.
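The heavy-tailed character of a K-distributed variable can be seen from its standard compound construction (a Gamma-distributed local mean power modulating an exponential intensity). The shape parameter and sample count below are illustrative assumptions, and this is a generic sketch of the K model rather than the authors' derivation:

```python
import numpy as np

rng = np.random.default_rng(1)
nu, n = 1.5, 200_000    # K-distribution shape parameter and sample count (illustrative)

# Compound construction: the local mean power is Gamma-distributed and the
# instantaneous intensity is exponential around that mean; the resulting
# amplitude is K-distributed with shape nu.
mean_power = rng.gamma(nu, 1.0 / nu, size=n)       # normalised so E[mean_power] = 1
intensity = mean_power * rng.exponential(1.0, size=n)
amplitude = np.sqrt(intensity)

# Tail check: E[I^2]/E[I]^2 is 2 for a plain exponential intensity (Rayleigh
# amplitude) but 2*(nu+1)/nu for the K model, i.e. a heavier tail.
ratio = (intensity**2).mean() / intensity.mean()**2
print("second-moment ratio: %.2f (exponential gives 2, K predicts %.2f)"
      % (ratio, 2 * (nu + 1) / nu))
```

The excess over the exponential value of 2 is the non-Gaussian behaviour that a Gaussian interference model cannot represent.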

Relevance: 90.00%

Abstract:

Salmonella is the second most commonly reported human foodborne pathogen in England and Wales, and antimicrobial-resistant strains of Salmonella are an increasing problem in both human and veterinary medicine. In this work we used a generalized linear spatial model to estimate the spatial and temporal patterns of antimicrobial resistance in Salmonella Typhimurium in England and Wales. For the antimicrobials considered, we found a common peak in the probability that an S. Typhimurium incident will show resistance to a given antimicrobial in late spring and in mid to late autumn; however, for one of the antimicrobials (streptomycin) there was a sharp drop, over the last 18 months of the period of investigation, in the probability of resistance. We also found a higher probability of resistance in North Wales, which is consistent across the antimicrobials considered. This information contributes to our understanding of the epidemiology of antimicrobial resistance in Salmonella.
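The idea of a seasonal peak in a resistance probability can be illustrated much more simply than with the paper's generalized linear spatial model: below is a minimal empirical sketch on synthetic incidents (the two-harmonic seasonal curve, its coefficients, and the sample size are all invented for illustration), estimating a monthly probability of resistance by simple proportions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic incidents: month of report and a binary resistance outcome whose
# true probability has two seasonal peaks per year (invented curve).
months = rng.integers(1, 13, size=20_000)
p_true = 0.3 + 0.1 * np.sin(4 * np.pi * (months - 2) / 12)
resistant = rng.random(20_000) < p_true

# Empirical monthly probability of resistance: the crudest "temporal pattern"
# estimate; the paper fits a generalized linear spatial model instead, which
# also borrows strength across space.
monthly = np.array([resistant[months == m].mean() for m in range(1, 13)])
for m, p in enumerate(monthly, start=1):
    print("month %2d: P(resistant) = %.3f" % (m, p))
```

Reading the peaks off such a curve is the temporal half of the analysis; the spatial half (e.g. the North Wales signal) requires the spatial random effects the paper's model provides.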

Relevance: 90.00%

Abstract:

Aircraft flying through cold ice-supersaturated air produce persistent contrails, which contribute to the climate impact of aviation. Here, we demonstrate the importance of the weather situation, together with the aircraft's route and altitude through it, for estimating contrail coverage. The results have implications for determining the climate impact of contrails as well as for potential mitigation strategies. Twenty-one years of re-analysis data are used to produce a climatological assessment of conditions favorable for persistent contrail formation between 200 and 300 hPa over the North Atlantic in winter. The seasonal-mean frequency of cold ice-supersaturated regions is highest near 300 hPa and decreases with altitude. The frequency of occurrence of ice-supersaturated regions varies with the large-scale weather pattern; the most common locations are over Greenland, on the southern side of the jet stream, and around the northern edge of high-pressure ridges. Assuming aircraft take a great-circle route, as opposed to a more realistic time-optimal route, is likely to lead to an error in the estimated contrail coverage, which can exceed 50% for westbound North Atlantic flights. The probability of contrail formation can increase or decrease with height, depending on the weather pattern, indicating that the generic suggestion that flying higher leads to fewer contrails is not robust.
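Flagging grid points as favorable for persistent contrails amounts to a joint threshold on temperature and ice-supersaturation. The sketch below uses synthetic fields and a single rough temperature cut-off; the 233 K value and the field statistics are assumptions for illustration, not the full Schmidt-Appleman calculation or the re-analysis data used in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy gridded fields at one pressure level: temperature (K) and relative
# humidity with respect to ice (%).  Purely synthetic stand-ins.
temperature = rng.normal(225.0, 8.0, size=(90, 180))
rhi = rng.normal(80.0, 25.0, size=(90, 180))

# Illustrative criterion for persistent-contrail-favorable air: cold enough
# for contrails to form, and ice-supersaturated so that they persist.
favorable = (temperature < 233.0) & (rhi >= 100.0)
frequency = favorable.mean()
print("fraction of grid points favorable: %.3f" % frequency)
```

Repeating such a mask over 21 winters of re-analysis fields, level by level, is what yields the climatological frequency-of-occurrence maps described in the abstract.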

Relevance: 90.00%

Abstract:

The probability of a quantum particle being detected in a given solid angle is determined by the S-matrix. The explanation of this fact in time-dependent scattering theory is often linked to the quantum flux, since the quantum flux integrated against a (detector-) surface and over a time interval can be viewed as the probability that the particle crosses this surface within the given time interval. Regarding many particle scattering, however, this argument is no longer valid, as each particle arrives at the detector at its own random time. While various treatments of this problem can be envisaged, here we present a straightforward Bohmian analysis of many particle potential scattering from which the S-matrix probability emerges in the limit of large distances.

Relevance: 90.00%

Abstract:

Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: firstly, a uniform rank histogram is but a necessary condition for reliability. Secondly, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria, and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a goodness-of-fit statistic is discussed, which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium-range forecasts of 2 m temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score. Those forecasts which feature a high expected score turn out to be particularly unreliable.
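The rank histogram itself, and the question of how far it may deviate from flat by chance alone, can both be sketched directly. The ensemble size, case count, and the use of a plain chi-square statistic are illustrative assumptions (the paper discusses a goodness-of-fit statistic based on the Ignorance score instead):

```python
import numpy as np

rng = np.random.default_rng(4)
n_cases, n_ens = 5_000, 9    # forecast cases and ensemble size (illustrative)

# A perfectly reliable ensemble: observation and members drawn from the same
# distribution, here N(0,1).
ensembles = rng.normal(size=(n_cases, n_ens))
obs = rng.normal(size=n_cases)

# Rank of the observation among the members: 0..n_ens, giving n_ens+1 bins.
ranks = (ensembles < obs[:, None]).sum(axis=1)
counts = np.bincount(ranks, minlength=n_ens + 1)

# Chi-square statistic against the uniform expectation: one simple way to
# quantify how far a histogram deviates from flat purely by randomness.
expected = n_cases / (n_ens + 1)
chi2 = ((counts - expected) ** 2 / expected).sum()
print("bin counts:", counts)
print("chi-square vs uniform (df=%d): %.1f" % (n_ens, chi2))
```

Even for this reliable ensemble the bin counts are not exactly equal; comparing the statistic against its null distribution, rather than eyeballing flatness, is the point the abstract makes.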

Relevance: 90.00%

Abstract:

The removal of the most long-lived radiotoxic elements from used nuclear fuel, minor actinides, is foreseen as an essential step toward increasing the public acceptance of nuclear energy as a key component of a low-carbon energy future. Once removed from the remaining used fuel, these elements can be used as fuel in their own right in fast reactors or converted into shorter-lived or stable elements by transmutation prior to geological disposal. The SANEX process is proposed to carry out this selective separation by solvent extraction. Recent efforts to develop reagents capable of separating the radioactive minor actinides from lanthanides as part of a future strategy for the management and reprocessing of used nuclear fuel are reviewed. The current strategies for the reprocessing of PUREX raffinate are summarized, and some guiding principles for the design of actinide-selective reagents are defined. The development and testing of different classes of solvent extraction reagent are then summarized, covering some of the earliest ligand designs right through to the current reagents of choice, bis(1,2,4-triazine) ligands. Finally, we summarize research aimed at developing a fundamental understanding of the underlying reasons for the excellent extraction capabilities and high actinide/lanthanide selectivities shown by this class of ligands and our recent efforts to immobilize these reagents onto solid phases.

Relevance: 90.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change occur mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data-analysis infrastructure.
Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data-analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.

Relevance: 90.00%

Abstract:

Aims: Quinolone antibiotics are the agents of choice for treating systemic Salmonella infections. Resistance to quinolones is usually mediated by mutations in the DNA gyrase gene gyrA. Here we report the evaluation of standard HPLC equipment for the detection of mutations (single nucleotide polymorphisms; SNPs) in gyrA, gyrB, parC and parE by denaturing high-performance liquid chromatography (DHPLC). Methods: A panel of Salmonella strains was assembled which comprised those with known different mutations in gyrA (n = 8) and fluoroquinolone-susceptible and -resistant strains (n = 50) that had not been tested for mutations in gyrA. Additionally, antibiotic-susceptible strains of serotypes other than Salmonella enterica serovar Typhimurium were examined for serotype-specific mutations in gyrB (n = 4), parC (n = 6) and parE (n = 1). Wild-type (WT) control DNA was prepared from Salmonella Typhimurium NCTC 74. The DNA of the respective strains was amplified by PCR using Optimase® proofreading DNA polymerase. Duplex DNA samples were analysed using an Agilent A1100 HPLC system with a Varian Helix™ DNA column. Sequencing was used to validate mutations detected by DHPLC in the strains with unknown mutations. Results: Using this HPLC system, mutations in gyrA, gyrB, parC and parE were readily detected by comparison with control chromatograms. Sequencing confirmed the gyrA mutations predicted by DHPLC in the unknown strains and also confirmed serotype-associated sequence changes in non-Typhimurium serotypes. Conclusions: The results demonstrated that a non-specialist standard HPLC machine fitted with a generally available column can be used to detect SNPs in the gyrA, gyrB, parC and parE genes by DHPLC. Wider applications should be possible.

Relevance: 90.00%

Abstract:

This study considers the role that reserve prices may play in residential property auctions. In comparison to much of the previous empirical work, this study has access to undisclosed reserve prices from English auctions. Consistent with theoretical arguments in the auction literature, the results obtained illustrate that whilst higher reserve prices increase the revenue obtained for the seller, they also reduce the probability of sale. The findings also highlight the importance of auction participation, with the number of individual bidders and the number of bids significant in most specifications.
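The trade-off the abstract describes, higher reserve prices raising seller revenue while lowering the probability of sale, can be reproduced in a stylised auction simulation. The second-price format, Uniform(0,1) valuations, and bidder counts below are illustrative modelling assumptions, not the study's English-auction data:

```python
import numpy as np

rng = np.random.default_rng(5)

def auction_outcomes(reserve, n_bidders=5, n_auctions=100_000):
    """Second-price auction with a reserve: the lot sells only if the top
    valuation meets the reserve; the price is max(second-highest, reserve).
    Bidder valuations are Uniform(0,1), purely for illustration."""
    values = np.sort(rng.random((n_auctions, n_bidders)), axis=1)
    top, second = values[:, -1], values[:, -2]
    sold = top >= reserve
    revenue = np.where(sold, np.maximum(second, reserve), 0.0)
    return sold.mean(), revenue.mean()

for r in (0.0, 0.3, 0.6, 0.9):
    p_sale, exp_rev = auction_outcomes(r)
    print("reserve %.1f: P(sale) = %.3f, expected revenue = %.3f"
          % (r, p_sale, exp_rev))
```

The sale probability falls monotonically in the reserve, while expected revenue first rises (the reserve props up the price) and then collapses as too many lots go unsold, mirroring the theoretical trade-off cited in the abstract.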

Relevance: 90.00%

Abstract:

The two-way relationship between Rossby wave breaking (RWB) and the intensification of extratropical cyclones is analysed over the Euro-Atlantic sector. In particular, the timing, intensity and location of cyclone development are related to RWB occurrences. For this purpose, two potential-temperature-based indices are used to detect and classify anticyclonic and cyclonic RWB episodes from ERA-40 re-analysis data. Results show that explosive cyclogenesis over the North Atlantic (NA) is fostered by enhanced occurrence of RWB on days prior to the cyclone's maximum intensification. Under such conditions, the eddy-driven jet stream is accelerated over the NA, thus enhancing conditions for cyclogenesis. For explosive cyclogenesis over the eastern NA, enhanced cyclonic RWB over eastern Greenland and anticyclonic RWB over the sub-tropical NA are observed. Typically only one of these is present in any given case, with RWB over eastern Greenland being more frequent than its southern counterpart. This leads to an intensification of the jet over the eastern NA and an enhanced probability of windstorms reaching Western Europe. Explosive cyclones evolving under simultaneous RWB on both sides of the jet feature higher mean intensities and deepening rates than cyclones preceded by a single RWB event. Explosive developments over the western NA are typically linked to a single area of enhanced cyclonic RWB over western Greenland. Here, the eddy-driven jet is accelerated over the western NA. Enhanced occurrence of cyclonic RWB over southern Greenland and anticyclonic RWB over Europe is also observed after explosive cyclogenesis, potentially leading to the onset of Scandinavian blocking. However, only very intense developments have a considerable influence on the large-scale atmospheric flow. Non-explosive cyclones show no sign of enhanced RWB over the whole NA area.
We conclude that the links between RWB and cyclogenesis over the Euro-Atlantic sector are sensitive to the cyclone’s maximum intensity, deepening rate and location.

Relevance: 90.00%

Abstract:

In this paper, we investigate half-duplex two-way dual-hop channel state information (CSI)-assisted amplify-and-forward (AF) relaying in the presence of in-phase and quadrature-phase (I/Q) imbalance. A compensation approach for the I/Q imbalance is proposed, which employs the received signals together with their conjugations to detect the desired signal. We also derive the average symbol error probability of the considered half-duplex two-way dual-hop CSI-assisted AF relaying networks with and without compensation for I/Q imbalance in Rayleigh fading channels. Numerical results are provided and show that the proposed compensation method mitigates the impact of I/Q imbalance to a certain extent.
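The core of the compensation idea, detecting the desired signal from the received signal together with its conjugate, follows from a standard baseband I/Q-imbalance model y = K1·s + K2·conj(s). The sketch below verifies the conjugate algebra on QPSK symbols; the imbalance parameters and the noiseless single-hop setting are assumptions for illustration, not the paper's two-way relaying network:

```python
import numpy as np

rng = np.random.default_rng(6)

# Common baseband I/Q-imbalance model: amplitude/phase mismatch (g, phi)
# maps the ideal signal s to y = K1*s + K2*conj(s).  Values are illustrative.
g, phi = 1.1, np.deg2rad(10.0)
K1 = (1 + g * np.exp(-1j * phi)) / 2
K2 = (1 - g * np.exp(1j * phi)) / 2

s = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=1000) / np.sqrt(2)  # QPSK
y = K1 * s + K2 * np.conj(s)

# Compensation using the received signal together with its conjugate:
# conj(K1)*y - K2*conj(y) = (|K1|^2 - |K2|^2) * s, so divide out the scalar.
s_hat = (np.conj(K1) * y - K2 * np.conj(y)) / (abs(K1) ** 2 - abs(K2) ** 2)
print("max recovery error: %.2e" % np.abs(s_hat - s).max())
```

In the noiseless case the recovery is exact; with noise and dual-hop relaying the same combination only mitigates the imbalance "to a certain extent", as the abstract notes.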

Relevance: 90.00%

Abstract:

We propose and demonstrate a fully probabilistic (Bayesian) approach to the detection of cloudy pixels in thermal infrared (TIR) imagery observed from satellite over oceans. Using this approach, we show how to exploit the prior information and the fast forward modelling capability that are typically available in the operational context to obtain improved cloud detection. The probability of clear sky for each pixel is estimated by applying Bayes' theorem, and we describe how to apply Bayes' theorem to this problem in general terms. Joint probability density functions (PDFs) of the observations in the TIR channels are needed; the PDFs for clear conditions are calculable from forward modelling and those for cloudy conditions have been obtained empirically. Using analysis fields from numerical weather prediction as prior information, we apply the approach to imagery representative of imagers on polar-orbiting platforms. In comparison with the established cloud-screening scheme, the new technique decreases both the rate of failure to detect cloud contamination and the false-alarm rate by one quarter. The rate of occurrence of cloud-screening-related errors of >1 K in area-averaged SSTs is reduced by 83%. Copyright © 2005 Royal Meteorological Society.
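The per-pixel application of Bayes' theorem described above reduces to combining a prior probability of clear sky with likelihoods of the observations under the clear and cloudy hypotheses. The sketch below uses 1-D Gaussian stand-ins for those likelihoods (the paper obtains them from forward modelling and empirically); the means, spreads, and prior are illustrative assumptions:

```python
import numpy as np

def p_clear(obs, prior_clear, pdf_clear, pdf_cloud):
    """Posterior probability of clear sky for one pixel via Bayes' theorem."""
    num = pdf_clear(obs) * prior_clear
    return num / (num + pdf_cloud(obs) * (1.0 - prior_clear))

# Stand-in likelihoods for a single brightness-temperature departure (K):
# clear pixels cluster near zero, cloudy pixels are biased cold and spread out.
gauss = lambda x, mu, sd: np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
clear_pdf = lambda x: gauss(x, 0.0, 0.5)
cloud_pdf = lambda x: gauss(x, -4.0, 3.0)

for departure in (0.0, -1.0, -5.0):
    post = p_clear(departure, 0.7, clear_pdf, cloud_pdf)
    print("departure %+.1f K -> P(clear) = %.3f" % (departure, post))
```

In the operational setting the observation is multi-channel and the prior comes from NWP analysis fields, but the posterior computation is exactly this ratio.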

Relevance: 90.00%

Abstract:

Methods for the recombinant production of eukaryotic membrane proteins yielding sufficient quantity and quality of protein for structural biology remain a challenge. We describe here the optimisation of expression and purification of the human SERCA2a cardiac isoform of the Ca2+-translocating ATPase, using Saccharomyces cerevisiae as the heterologous expression system of choice. Two different expression vectors were utilised, allowing expression of C-terminal fusion proteins with a biotinylation domain or a GFP-His8 tag. Solubilised membrane fractions containing the protein of interest were purified on Streptavidin-Sepharose, Ni-NTA or Talon resin, depending on the fusion tag present. Biotinylated protein was detected using a specific antibody directed against SERCA2 and, advantageously, the GFP-His8 fusion protein was easily traced during the purification steps using in-gel fluorescence. Importantly, Talon resin affinity purification proved more specific than Ni-NTA resin for the GFP-His8-tagged protein, providing better separation of the oligomers present during size-exclusion chromatography. The optimised method for expression and purification of human cardiac SERCA2a reported herein yields purified protein (>90%) that displays calcium-dependent, thapsigargin-sensitive activity and is suitable for further biophysical, structural and physiological studies. This work supports the use of Saccharomyces cerevisiae as a suitable expression system for the recombinant production of multi-domain eukaryotic membrane proteins.

Relevance: 90.00%

Abstract:

Choices not only reflect our preferences; they also affect our behavior. The phenomenon of choice-induced preference change has long been of interest to cognitive dissonance researchers in social psychology, and more recently it has attracted the attention of researchers in economics and neuroscience. Preference modulation after the mere act of making a choice has been repeatedly demonstrated over the last 50 years by an experimental paradigm called the “free-choice paradigm.” However, Chen and Risen (2010) pointed out a serious methodological flaw in this paradigm, arguing that evidence for choice-induced preference change is still insufficient. Despite the flaw, studies using the traditional free-choice paradigm continue to be published without addressing the criticism. Here, aiming to draw more attention to this issue, we briefly explain the methodological problem, and then describe simple simulation studies that illustrate how the free-choice paradigm produces a systematic pattern of preference change consistent with cognitive dissonance, even without any change in true preference. Our simulation also shows how a different level of noise in each phase of the free-choice paradigm independently contributes to the magnitude of artificial preference change. Furthermore, we review ways of addressing the critique and provide a meta-analysis to show the effect size of choice-induced preference change after addressing the critique. Finally, based on the results of the simulation studies, we review and discuss how the criticism affects our interpretation of past findings generated from the free-choice paradigm. We conclude that the use of the conventional free-choice paradigm should be avoided in future research and that the validity of past findings from studies using this paradigm should be empirically re-established.
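The selection artifact identified by Chen and Risen can be reproduced in a few lines: rate, pair similarly rated items, choose, rate again, with true preferences held fixed throughout. The noise levels, pairing threshold, and sample sizes below are illustrative assumptions, not the paper's actual simulation settings:

```python
import numpy as np

rng = np.random.default_rng(7)
n_pairs, noise_sd = 100_000, 1.0   # illustrative sizes

# True preferences: fixed, and never change between phases.
v = rng.normal(size=(n_pairs, 2))

# Phase 1: noisy ratings; the paradigm then pairs items rated *similarly*.
r1 = v + noise_sd * rng.normal(size=(n_pairs, 2))
close = np.abs(r1[:, 0] - r1[:, 1]) < 0.1
v, r1 = v[close], r1[close]

# Choice: driven by true preference plus independent decision noise.
choice = (v + noise_sd * rng.normal(size=v.shape)).argmax(axis=1)
idx = np.arange(len(v))

# Phase 2: fresh noisy ratings of the same unchanged true preferences.
r2 = v + noise_sd * rng.normal(size=v.shape)

chosen_spread = r2[idx, choice] - r1[idx, choice]
rejected_spread = r2[idx, 1 - choice] - r1[idx, 1 - choice]
spread = (chosen_spread - rejected_spread).mean()
print("mean 'preference spread' with NO true change: %.3f" % spread)
```

Because the choice carries information about true preference that the matched phase-1 ratings do not, the chosen item's rating rises and the rejected item's falls on average, mimicking choice-induced preference change even though nothing changed.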