917 results for Gaussian prior variance
Abstract:
In the beginning, a leaf from a register of the enquêteurs of Saint Louis.
Abstract:
Because of the various matrices available for forensic investigations, the development of versatile analytical approaches allowing the simultaneous determination of drugs is challenging. The aim of this work was to assess a liquid chromatography-tandem mass spectrometry (LC-MS/MS) platform allowing the rapid quantification of colchicine in body fluids and tissues collected in the context of a fatal overdose. For this purpose, filter paper was used as a sampling support and was associated with an automated 96-well plate extraction performed by the LC autosampler itself. The developed method features a 7-min total run time including automated filter paper extraction (2 min) and chromatographic separation (5 min). The sample preparation was reduced to a minimum regardless of the matrix analyzed. This platform was fully validated for dried blood spots (DBS) in the toxic concentration range of colchicine. The DBS calibration curve was applied successfully to quantification in all other matrices (body fluids and tissues) except for bile, where an excessive matrix effect was found. The distribution of colchicine for a fatal overdose case was reported as follows: peripheral blood, 29 ng/ml; urine, 94 ng/ml; vitreous humour and cerebrospinal fluid, < 5 ng/ml; pericardial fluid, 14 ng/ml; brain, < 5 pg/mg; heart, 121 pg/mg; kidney, 245 pg/mg; and liver, 143 pg/mg. Although filter paper is usually employed for DBS, we report here the extension of this alternative sampling support to the analysis of other body fluids and tissues. The developed platform represents a rapid and versatile approach for drug determination in multiple forensic media.
Abstract:
The mutual information of independent parallel Gaussian-noise channels is maximized, under an average power constraint, by independent Gaussian inputs whose power is allocated according to the waterfilling policy. In practice, discrete signalling constellations with limited peak-to-average ratios (m-PSK, m-QAM, etc.) are used in lieu of the ideal Gaussian signals. This paper gives the power allocation policy that maximizes the mutual information over parallel channels with arbitrary input distributions. Such a policy admits a graphical interpretation, referred to as mercury/waterfilling, which generalizes the waterfilling solution and retains some of its intuition. The relationship between the mutual information of Gaussian channels and the nonlinear minimum mean-square error proves key to solving the power allocation problem.
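As context for the mercury/waterfilling generalization described above, the classical waterfilling baseline can be sketched numerically. This is a minimal illustration assuming Gaussian inputs; the function name, the bisection solver, and its tolerance are our own choices, and the mercury/waterfilling policy itself (which also requires the MMSE of the input constellation) is not reproduced here.

```python
import numpy as np

def waterfilling(gains, total_power, tol=1e-9):
    """Classical waterfilling for parallel Gaussian channels.

    Allocates p_i = max(0, mu - 1/g_i) and chooses the water level mu
    by bisection so that sum(p_i) = total_power, which maximizes
    sum_i log(1 + g_i * p_i) under the average power constraint.
    """
    gains = np.asarray(gains, dtype=float)
    lo, hi = 0.0, total_power + 1.0 / gains.min()  # mu is bracketed here
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        p = np.maximum(0.0, mu - 1.0 / gains)
        if p.sum() > total_power:
            hi = mu  # water level too high
        else:
            lo = mu  # water level too low
    return np.maximum(0.0, mu - 1.0 / gains)
```

For two channels with gains 1.0 and 0.5 and unit total power, the weaker channel receives no power: the water level settles at mu = 2, giving the allocation (1, 0).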
Abstract:
We present a method to compute, quickly and efficiently, the mutual information achieved by an IID (independent identically distributed) complex Gaussian signal on a block Rayleigh-faded channel without side information at the receiver. The method accommodates both scalar and MIMO (multiple-input multiple-output) settings. Operationally, this mutual information represents the highest spectral efficiency that can be attained using Gaussian codebooks. Examples are provided that illustrate the loss in spectral efficiency caused by fast fading and how that loss is amplified when multiple transmit antennas are used. These examples are further enriched by comparisons with the channel capacity under perfect channel-state information at the receiver, and with the spectral efficiency attained by pilot-based transmission.
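The perfect-CSI benchmark used for comparison above can be estimated by plain Monte Carlo. This is a minimal sketch for the scalar Rayleigh case only, with hypothetical names and sample counts; the paper's actual method, which handles the no-side-information case, is not reproduced here.

```python
import numpy as np

def ergodic_capacity_rayleigh(snr_db, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E[log2(1 + SNR * |h|^2)] in bits/s/Hz for a
    scalar Rayleigh-fading channel with perfect receiver channel-state
    information. h is CN(0, 1), so |h|^2 is exponentially distributed."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    h = (rng.standard_normal(n_samples)
         + 1j * rng.standard_normal(n_samples)) / np.sqrt(2.0)
    return float(np.mean(np.log2(1.0 + snr * np.abs(h) ** 2)))
```

At 10 dB SNR the estimate is close to the exact value of about 2.91 bits/s/Hz, below the AWGN value log2(11) ≈ 3.46, which illustrates the fading penalty even before side information is removed.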
Abstract:
This paper studies the fundamental operational limits of a class of Gaussian multicast channels with an interference setting. In particular, the paper considers two base stations multicasting separate messages to distinct sets of users. In the presence of channel state information at the transmitters and at the respective receivers, the capacity region of the Gaussian multicast channel with interference is characterized to within one bit. At the crux of this result is an extension to the multicast channel with interference of the Han-Kobayashi or the Chong-Motani-Garg achievable region for the interference channel.
Abstract:
Prior probabilities represent a core element of the Bayesian probabilistic approach to relatedness testing. This letter comments on the commentary 'Use of prior odds for missing persons identifications' by Budowle et al. (2011), published recently in this journal. Contrary to Budowle et al. (2011), we argue that the concept of prior probabilities (i) is not endowed with the notion of objectivity, (ii) is not a case for computation and (iii) does not require new guidelines edited by the forensic DNA community - as long as probability is properly considered as an expression of personal belief. Please see related article: http://www.investigativegenetics.com/content/3/1/3
Abstract:
The vast territories that have been radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure leading to informative contamination maps. Exact measurements ('hard' data) are combined with secondary information on local uncertainties (treated as 'soft' data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.
Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, which are both useful features for the Chernobyl fallout study.
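The Gaussian special case noted above, in which the BME uncertainty reduces to a kriging-type variance, can be illustrated with a one-dimensional conjugate update. This is a minimal sketch with hypothetical names, treating a soft datum as a noisy Gaussian observation and a hard datum as its zero-noise limit; the full BME machinery makes no such distributional assumption.

```python
def gaussian_update(prior_mean, prior_var, obs, obs_var):
    """Conjugate Gaussian update of a scalar field value.

    A 'soft' datum is an observation with variance obs_var > 0;
    a 'hard' (exact) datum corresponds to the limit obs_var -> 0,
    in which case the posterior collapses onto the observation.
    """
    weight = prior_var / (prior_var + obs_var)          # gain on the datum
    post_mean = prior_mean + weight * (obs - prior_mean)
    post_var = prior_var * obs_var / (prior_var + obs_var)
    return post_mean, post_var
```

With equal prior and observation variances, the posterior mean sits halfway between prior mean and observation and the posterior variance is halved, mirroring how adding soft data tightens the contamination estimates.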
Abstract:
In recent years, the potential of type-2 fuzzy sets for managing high levels of uncertainty in the subjective knowledge of experts or in numerical information has drawn attention in control and pattern classification systems. One of the main challenges in designing a type-2 fuzzy logic system is how to estimate the parameters of the type-2 fuzzy membership function (T2MF) and the Footprint of Uncertainty (FOU) from imperfect and noisy datasets. This paper presents an automatic approach for learning and tuning Gaussian interval type-2 membership functions (IT2MFs) with application to multi-dimensional pattern classification problems. T2MFs and their FOUs are tuned according to the uncertainties in the training dataset by a combination of genetic algorithm (GA) and cross-validation techniques. In our GA-based approach, the structure of the chromosome has fewer genes than other GA methods and chromosome initialization is more precise. The proposed approach addresses the application of the interval type-2 fuzzy logic system (IT2FLS) to the problem of nodule classification in a lung Computer Aided Detection (CAD) system. The designed IT2FLS is compared with its type-1 fuzzy logic system (T1FLS) counterpart. The results demonstrate that the IT2FLS outperforms the T1FLS by more than 30% in terms of classification accuracy.
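One common parameterization of a Gaussian interval type-2 membership function uses a fixed mean and an uncertain standard deviation, with the FOU bounded by the lower and upper membership curves. This is a minimal sketch with hypothetical names and is only an assumption about the paper's parameterization; the GA-based tuning itself is not reproduced here.

```python
import numpy as np

def gaussian_it2mf(x, mean, sigma_lower, sigma_upper):
    """Gaussian interval type-2 membership function (fixed mean,
    uncertain standard deviation sigma in [sigma_lower, sigma_upper]).

    Returns (lower, upper) membership grades at x; the band between
    the two curves is the Footprint of Uncertainty (FOU).
    """
    x = np.asarray(x, dtype=float)
    upper = np.exp(-0.5 * ((x - mean) / sigma_upper) ** 2)  # widest Gaussian
    lower = np.exp(-0.5 * ((x - mean) / sigma_lower) ** 2)  # narrowest Gaussian
    return lower, upper
```

At the mean, both bounds equal 1 (the FOU closes); away from the mean, the band widens, which is how the IT2MF encodes uncertainty about the membership grade itself.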
Abstract:
BACKGROUND: We investigated clinical predictors of appropriate prophylaxis prior to the onset of venous thromboembolism (VTE). METHODS: In 14 Swiss hospitals, 567 consecutive patients (306 medical, 261 surgical) with acute VTE and hospitalization < 30 days prior to the VTE event were enrolled. RESULTS: Prophylaxis was used in 329 (58%) patients within 30 days prior to the VTE event. Among the medical patients, 146 (48%) received prophylaxis, and among the surgical patients, 183 (70%) received prophylaxis (P < 0.001). The indication for prophylaxis was present in 262 (86%) medical patients and in 217 (83%) surgical patients. Among the patients with an indication for prophylaxis, 135 (52%) of the medical patients and 165 (76%) of the surgical patients received prophylaxis (P < 0.001). Admission to the intensive care unit [odds ratio (OR) 3.28, 95% confidence interval (CI) 1.94-5.57], recent surgery (OR 2.28, 95% CI 1.51-3.44), bed rest > 3 days (OR 2.12, 95% CI 1.45-3.09), obesity (OR 2.01, 95% CI 1.03-3.90), prior deep vein thrombosis (OR 1.71, 95% CI 1.31-2.24) and prior pulmonary embolism (OR 1.54, 95% CI 1.05-2.26) were independent predictors of prophylaxis. In contrast, cancer (OR 1.06, 95% CI 0.89-1.25), age (OR 0.99, 95% CI 0.98-1.01), acute heart failure (OR 1.13, 95% CI 0.79-1.63) and acute respiratory failure (OR 1.19, 95% CI 0.89-1.59) were not predictive of prophylaxis. CONCLUSIONS: Although an indication for prophylaxis was present in most patients who suffered acute VTE, almost half did not receive any form of prophylaxis. Future efforts should focus on the improvement of prophylaxis for hospitalized patients, particularly in patients with cancer, acute heart or respiratory failure, and in the elderly.
Spanning tests in return and stochastic discount factor mean-variance frontiers: A unifying approach
Abstract:
We propose new spanning tests that assess if the initial and additional assets share the economically meaningful cost and mean representing portfolios. We prove their asymptotic equivalence to existing tests under local alternatives. We also show that unlike two-step or iterated procedures, single-step methods such as continuously updated GMM yield numerically identical overidentifying restrictions tests, so there is arguably a single spanning test. To prove these results, we extend optimal GMM inference to deal with singularities in the long-run second moment matrix of the influence functions. Finally, we test for spanning using size and book-to-market sorted US stock portfolios.