71 results for Error Probability


Relevance: 20.00%

Publisher:

Abstract:

Growth codes are a subclass of Rateless codes that have found interesting applications in data dissemination problems. Compared to other Rateless and conventional channel codes, Growth codes show improved intermediate performance, which is particularly useful in applications where partial data already has some utility. In this paper, we investigate the asymptotic performance of Growth codes using the Wormald method, which was originally proposed for studying the Peeling Decoder of LDPC and LDGM codes. Compared to previous works, the Wormald differential equations are formulated from the nodes' perspective, which enables a numerical solution for the expected asymptotic decoding performance of Growth codes. Our framework is appropriate for any class of Rateless codes that does not include a precoding step. We further study the performance of Growth codes with moderate and large codeblocks through simulations, and we use the generalized logistic function to model the decoding probability. We then exploit this decoding probability model in an illustrative application of Growth codes to error-resilient video transmission. The video transmission problem is cast as a joint source and channel rate allocation problem that is shown to be convex with respect to the channel rate. This illustrative application highlights the main advantage of Growth codes, namely their improved performance in the intermediate loss region.
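
As a minimal illustration of the decoding-probability model mentioned above, the sketch below fits a generalized logistic curve to decoding measurements with scipy; the data points and initial parameter guesses are invented for the example and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def generalized_logistic(x, A, K, B, M, nu):
    """Generalized logistic curve used to model decoding probability
    as a function of the ratio of received to source symbols."""
    return A + (K - A) / (1.0 + np.exp(-B * (x - M))) ** (1.0 / nu)

# Hypothetical measurements: ratio of received symbols vs. fraction of
# recovered source symbols. Real values would come from Growth-code simulations.
x = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, 1.6])
y = np.array([0.05, 0.15, 0.35, 0.55, 0.72, 0.86, 0.94, 0.98])

params, _ = curve_fit(generalized_logistic, x, y,
                      p0=[0.0, 1.0, 5.0, 0.8, 1.0], maxfev=10000)
print("Fitted parameters (A, K, B, M, nu):", params)
```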

Relevance: 20.00%

Publisher:

Abstract:

Let $\{\mu^{(i)}_t\}_{t\ge 0}$ ($i=1,2$) be continuous convolution semigroups (c.c.s.) of probability measures on Aff(1) (the affine group on the real line). Suppose that $\mu^{(1)}_1=\mu^{(2)}_1$. Assume furthermore that $\{\mu^{(1)}_t\}_{t\ge 0}$ is a Gaussian c.c.s. (in the sense that its generating distribution is a sum of a primitive distribution and a second-order differential operator). Then $\mu^{(1)}_t=\mu^{(2)}_t$ for all $t\ge 0$. We end up with a possible application in mathematical finance.
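
Stated compactly, the uniqueness result described above reads as follows (a LaTeX restatement of the abstract, not additional material from the paper):

```latex
% Uniqueness of the embedding semigroup on Aff(1): agreement at time 1 with a
% Gaussian c.c.s. forces agreement at all times.
\[
\mu^{(1)}_1 = \mu^{(2)}_1
\quad\text{and}\quad
\{\mu^{(1)}_t\}_{t\ge 0}\ \text{Gaussian}
\;\Longrightarrow\;
\mu^{(1)}_t = \mu^{(2)}_t \quad\text{for all } t \ge 0 .
\]
```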

Relevance: 20.00%

Publisher:

Abstract:

RATIONALE In biomedical journals, authors sometimes use the standard error of the mean (SEM) for data description, which has been called inappropriate or incorrect. OBJECTIVE To assess the frequency of incorrect use of the SEM in articles in three selected cardiovascular journals. METHODS AND RESULTS All original journal articles published in 2012 in Cardiovascular Research, Circulation: Heart Failure and Circulation Research were assessed by two assessors for inappropriate use of the SEM when providing descriptive information on empirical data. We also assessed whether the authors stated in the methods section that the SEM would be used for data description. Of 441 articles included in this survey, 64% (282 articles) contained at least one instance of incorrect use of the SEM, with two journals having a prevalence above 70% and Circulation: Heart Failure having the lowest value (27%). In 81% of the articles with incorrect use of the SEM, the authors had explicitly stated that they used the SEM for data description, and in 89% SEM error bars were also used instead of 95% confidence intervals. Basic science studies had a 7.4-fold higher level of inappropriate SEM use (74%) than clinical studies (10%). LIMITATIONS The selection of the three cardiovascular journals was based on a subjective initial impression of observing inappropriate SEM use. The observed results are not representative of all cardiovascular journals. CONCLUSION In three selected cardiovascular journals we found a high level of inappropriate SEM use, often with explicit methods statements declaring its use for data description, especially in basic science studies. To improve this situation, these and other journals should provide clear instructions to authors on how to report descriptive information on empirical data.
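
For readers unsure of the distinction at issue, the short sketch below contrasts the SD (a descriptive measure of data spread) with the SEM and its 95% confidence interval (measures of the precision of the mean); the sample values are randomly generated for illustration only.

```python
import numpy as np
from scipy import stats

# Hypothetical measurements; the point is to contrast descriptive spread (SD)
# with the precision of the mean (SEM and its 95% confidence interval).
rng = np.random.default_rng(0)
sample = rng.normal(loc=120.0, scale=15.0, size=12)

mean = sample.mean()
sd = sample.std(ddof=1)                 # describes variability of the data
sem = sd / np.sqrt(len(sample))         # describes precision of the mean estimate
ci_low, ci_high = stats.t.interval(0.95, df=len(sample) - 1, loc=mean, scale=sem)

print(f"mean = {mean:.1f}, SD = {sd:.1f} (data description)")
print(f"SEM = {sem:.1f}, 95% CI = [{ci_low:.1f}, {ci_high:.1f}] (inference about the mean)")
```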

Relevance: 20.00%

Publisher:

Abstract:

Medical errors, in particular those resulting in harm, represent a serious situation for patients ("first victims") and the healthcare workers involved ("second victims") and can have long-lasting and distressing consequences. To prevent a second traumatization, appropriate and empathic interaction with all persons involved is essential, in addition to error analysis. Patients share a nearly universal, broad preference for a complete disclosure of incidents, regardless of age, gender, or education. This includes the personal, timely and unambiguous disclosure of the adverse event, information relating to the event, its causes and consequences, and an apology and sincere expression of regret. While the majority of healthcare professionals generally support an honest and open disclosure of adverse events, they also face various barriers which impede the disclosure (e.g., fear of legal consequences). Despite its essential importance, disclosure of adverse events in practice occurs in ways that are rarely acceptable to patients and their families. The staff involved often experience acute distress and an intense emotional response to the event, which may become chronic and increase the risk of depression, burnout and post-traumatic stress disorders. Communication with peers is vital for those involved to cope constructively and protectively with harmful errors. Survey studies among healthcare workers show, however, that they often do not receive sufficient individual and institutional support. Healthcare organizations should prepare for medical errors and harmful events and implement a communication plan and a support system that covers the requirements and different needs of patients and the staff involved.

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE We prospectively assessed the diagnostic accuracy of diffusion-weighted magnetic resonance imaging for detecting significant prostate cancer. MATERIALS AND METHODS We performed a prospective study of 111 consecutive men with prostate and/or bladder cancer who underwent 3 Tesla diffusion-weighted magnetic resonance imaging of the pelvis without an endorectal coil before radical prostatectomy (78 patients) or cystoprostatectomy (33 patients). Three independent readers blinded to clinical and pathological data assigned a prostate cancer suspicion grade based on qualitative imaging analysis. Final pathology results of prostates with and without cancer served as the reference standard. Primary outcomes were the sensitivity and specificity of diffusion-weighted magnetic resonance imaging for detecting significant prostate cancer, with significance defined as a largest diameter of the index lesion of 1 cm or greater, extraprostatic extension, or Gleason score 7 or greater on final pathology assessment. Secondary outcomes were interreader agreement, assessed by the Fleiss κ coefficient, and image reading time. RESULTS Of the 111 patients, 93 had prostate cancer, which was significant in 80 and insignificant in 13, and 18 had no prostate cancer on final pathology. The sensitivity and specificity of diffusion-weighted magnetic resonance imaging for detecting significant prostate cancer were 89% to 91% and 77% to 81%, respectively, for the three readers. Interreader agreement was good (Fleiss κ 0.65 to 0.74). Median reading time was between 13 and 18 minutes. CONCLUSIONS Diffusion-weighted magnetic resonance imaging at 3 Tesla is a noninvasive technique that allows detection of significant prostate cancer with high probability, without contrast medium or an endorectal coil, and with good interreader agreement and a short reading time. This technique should be further evaluated as a tool to stratify patients with prostate cancer for individualized treatment options.
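
For reference, sensitivity and specificity are computed from per-patient reader calls against the pathology reference standard; the sketch below uses invented counts chosen only to illustrate the calculation and is not the study data.

```python
# Sensitivity and specificity from hypothetical per-patient reader calls vs. pathology.
true_positive = 72   # significant cancer on pathology, called suspicious by the reader
false_negative = 8   # significant cancer on pathology, missed by the reader
true_negative = 25   # no significant cancer, correctly called negative
false_positive = 6   # no significant cancer, called suspicious

sensitivity = true_positive / (true_positive + false_negative)
specificity = true_negative / (true_negative + false_positive)
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```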

Relevance: 20.00%

Publisher:

Abstract:

Voting power is commonly measured using a probability. But what kind of probability is this? Is it a degree of belief or an objective chance or some other sort of probability? The aim of this paper is to answer this question. The answer depends on the use to which a measure of voting power is put. Some objectivist interpretations of probabilities are appropriate when we employ such a measure for descriptive purposes. By contrast, when voting power is used to normatively assess voting rules, the probabilities are best understood as classical probabilities, which count possibilities. This is so because, from a normative stance, voting power is most plausibly taken to concern rights and thus possibilities. The classical interpretation also underwrites the use of the Bernoulli model upon which the Penrose/Banzhaf measure is based.
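
To make the Bernoulli-model connection concrete, the sketch below computes the Penrose/Banzhaf measure for a small weighted voting game by counting the coalitions in which each voter is decisive; the weights and quota are a made-up example.

```python
from itertools import product

def banzhaf(weights, quota):
    """Penrose/Banzhaf voting power under the Bernoulli model:
    every voter votes 'yes' independently with probability 1/2, so
    power_i = (number of profiles in which voter i is decisive) / 2^(n-1)."""
    n = len(weights)
    power = [0] * n
    for votes in product((0, 1), repeat=n):             # all 2^n yes/no profiles
        total = sum(w for w, v in zip(weights, votes) if v)
        for i in range(n):
            if votes[i]:                                # voter i voted yes
                if total >= quota and total - weights[i] < quota:
                    power[i] += 1                       # i is decisive (a swing)
    return [p / 2 ** (n - 1) for p in power]

# Hypothetical weighted voting game: unequal weights but, as it turns out,
# equal voting power for all three voters.
print(banzhaf([3, 2, 2], quota=4))   # -> [0.5, 0.5, 0.5]
```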

Relevance: 20.00%

Publisher:

Abstract:

Approximate models (proxies) can be employed to reduce the computational cost of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to biased estimates. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between the proxy and the exact model on a learning set of geostatistical realizations for which both the exact and the approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. This methodology is purpose-oriented, as the error model is constructed directly for the quantity of interest rather than for the state of the system. In addition, the dimensionality reduction performed by FPCA provides a diagnostic of the quality of the error model, assessing the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be used effectively beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
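
The sketch below illustrates the general idea on synthetic data, using ordinary PCA on discretized curves as a stand-in for FPCA and a linear regression as the machine-learning step; the data, the choice of two components, and the linear map are assumptions of the example, not the paper's actual error model.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Hypothetical learning set: for n_learn realizations we have both the proxy
# response and the exact response, each discretized on n_t points.
rng = np.random.default_rng(1)
n_learn, n_new, n_t = 40, 5, 100
t = np.linspace(0.0, 1.0, n_t)
latent = rng.normal(size=(n_learn + n_new, 2))                 # hidden geostatistical factors
exact = latent[:, :1] * np.sin(2 * np.pi * t) + latent[:, 1:] * t
proxy = 0.8 * exact + 0.1 * rng.normal(size=exact.shape)       # biased, noisy approximation

# PCA on the learning curves acts as a stand-in for FPCA: each curve is reduced
# to a few scores while most of the variability is retained.
pca_exact = PCA(n_components=2).fit(exact[:n_learn])
pca_proxy = PCA(n_components=2).fit(proxy[:n_learn])
scores_exact = pca_exact.transform(exact[:n_learn])
scores_proxy = pca_proxy.transform(proxy[:n_learn])

# Error model: learn the map from proxy scores to exact scores on the learning set.
error_model = LinearRegression().fit(scores_proxy, scores_exact)

# For new realizations only the proxy is solved; the exact response is predicted.
pred_scores = error_model.predict(pca_proxy.transform(proxy[n_learn:]))
pred_exact = pca_exact.inverse_transform(pred_scores)
print("prediction RMSE:", np.sqrt(np.mean((pred_exact - exact[n_learn:]) ** 2)))
```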

Relevance: 20.00%

Publisher:

Abstract:

In this article, the Society for Personality and Social Psychology (SPSP) Task Force on Publication and Research Practices offers a brief statistical primer and recommendations for improving the dependability of research. Recommendations for research practice include (a) describing and addressing the choice of N (sample size) and consequent issues of statistical power, (b) reporting effect sizes and 95% confidence intervals (CIs), (c) avoiding “questionable research practices” that can inflate the probability of Type I error, (d) making available research materials necessary to replicate reported results, (e) adhering to SPSP’s data sharing policy, (f) encouraging publication of high-quality replication studies, and (g) maintaining flexibility and openness to alternative standards and methods. Recommendations for educational practice include (a) encouraging a culture of “getting it right,” (b) teaching and encouraging transparency of data reporting, (c) improving methodological instruction, and (d) modeling sound science and supporting junior researchers who seek to “get it right.”
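
As an illustration of point (c), the simulation below shows how one questionable research practice, optional stopping (testing repeatedly and stopping as soon as p < .05), inflates the Type I error rate well above the nominal 5%; the sample sizes and number of simulations are arbitrary choices for the sketch.

```python
import numpy as np
from scipy import stats

# Optional stopping under a true null hypothesis: any "significant" result
# is a Type I error, yet the empirical rate exceeds the nominal 5%.
rng = np.random.default_rng(42)
n_sims, n_start, n_max, step, alpha = 2000, 10, 50, 10, 0.05

false_positives = 0
for _ in range(n_sims):
    a = rng.normal(size=n_max)      # two groups drawn from the same distribution
    b = rng.normal(size=n_max)
    for n in range(n_start, n_max + 1, step):
        p = stats.ttest_ind(a[:n], b[:n]).pvalue
        if p < alpha:               # stop and "publish" as soon as p < .05
            false_positives += 1
            break

print(f"Empirical Type I error with optional stopping: {false_positives / n_sims:.3f}")
```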