960 results for Probability Metrics


Relevance:

20.00%

Publisher:

Abstract:

In this study, the authors propose simple methods to evaluate the achievable rates and outage probability of a cognitive radio (CR) link that take into account the imperfection of spectrum sensing. In the considered system, the CR transmitter and receiver correlatively sense and dynamically exploit the spectrum pool via dynamic frequency hopping. Under imperfect spectrum sensing, false alarms and missed detections occur, causing impulsive interference that emerges from collisions due to simultaneous spectrum access by primary and cognitive users. This makes the achievable rates very challenging to evaluate. By first examining the static link, where the channel is assumed to be constant over time, the authors show that the achievable rate under a Gaussian input can be calculated accurately through a simple series representation. In the second part of this study, they extend the calculation of the achievable rate to wireless fading environments. To take the effect of fading into account, they introduce a piecewise-linear curve-fitting method to approximate the instantaneous achievable rate curve as a combination of linear segments. It is then demonstrated that the ergodic achievable rate in fast fading and the outage probability in slow fading can be calculated to any given accuracy level.
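To illustrate the curve-fitting idea, the sketch below approximates a generic instantaneous rate curve by linear segments and then evaluates the ergodic rate in closed form under Rayleigh fading. The rate law log2(1 + SNR·g) and the unit-mean exponential gain model are illustrative assumptions, not the paper's exact model, which additionally accounts for the impulsive interference described above.

```python
import numpy as np

# A minimal sketch of the piecewise-linear approach: approximate the
# instantaneous rate curve r(g) by linear segments, then evaluate the
# ergodic rate in closed form under Rayleigh fading (exponential channel
# gain with mean 1). The rate r(g) = log2(1 + snr*g) is an assumption.

def rate(g, snr=10.0):
    return np.log2(1.0 + snr * g)

def ergodic_rate_piecewise(n_segments=64, g_max=10.0, snr=10.0):
    """Integrate each linear segment against the exponential pdf exp(-g)."""
    knots = np.linspace(0.0, g_max, n_segments + 1)
    r = rate(knots, snr)
    total = 0.0
    for x0, x1, r0, r1 in zip(knots[:-1], knots[1:], r[:-1], r[1:]):
        b = (r1 - r0) / (x1 - x0)      # slope of this segment
        a = r0 - b * x0                # intercept of this segment
        # closed form for int_{x0}^{x1} (a + b*g) * exp(-g) dg
        total += a * (np.exp(-x0) - np.exp(-x1))
        total += b * ((x0 + 1) * np.exp(-x0) - (x1 + 1) * np.exp(-x1))
    return total

# Monte Carlo reference to check the piecewise-linear estimate
g = np.random.default_rng(0).exponential(size=200_000)
print(ergodic_rate_piecewise())   # closed-form estimate over [0, g_max]
print(rate(g).mean())             # ~ same value, up to tail truncation
```

Refining the segment grid drives the approximation error to any desired level, which is the sense in which the accuracy can be made arbitrary.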

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the application of custom classification techniques and posterior probability modeling (PPM) to archaeological field survey using Worldview-2 multispectral imagery. Research focuses on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier on a combination of datasets including Worldview-2 bands, band difference ratios (BDR) and topographical derivatives. Principal components analysis is further used to test for and reduce dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested against sites identified through geological field survey. Testing demonstrates the prospective ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
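As a rough illustration of this workflow (not the study's actual data or parameters), the sketch below chains standardization, PCA for redundancy reduction, and an LDA classifier whose predict_proba output plays the role of the posterior probability model; every feature count and sample size here is an invented stand-in.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-ins for the real feature stack (Worldview-2 bands,
# band difference ratios, topographic derivatives).
rng = np.random.default_rng(42)
n_features = 12
sites = rng.normal(loc=0.5, size=(60, n_features))      # known workshops
non_sites = rng.normal(loc=0.0, size=(200, n_features)) # known non-sites
X = np.vstack([sites, non_sites])
y = np.array([1] * len(sites) + [0] * len(non_sites))

# PCA trims redundancy among correlated inputs before the LDA classifier;
# keeping 95% of variance is an illustrative choice.
model = make_pipeline(StandardScaler(), PCA(n_components=0.95),
                      LinearDiscriminantAnalysis())
model.fit(X, y)

# Applying predict_proba to each pixel's feature vector yields the
# posterior probability surface used to rank survey targets.
pixels = rng.normal(size=(1000, n_features))   # stand-in for raster pixels
posterior = model.predict_proba(pixels)[:, 1]  # P(site | features)
print(posterior[:5])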

Relevance:

20.00%

Publisher:

Abstract:

The second theme of this book concerns L&D’s ‘Contributions’: how L&D professionals articulate, communicate and demonstrate the value that L&D brings to the organization. Specifically, Chapter 3, titled ‘Using information, metrics and developing business cases for L&D’, discusses how L&D professionals can do this using the business case as a vehicle. The business case is a tool that L&D professionals can use to show how new L&D initiatives can benefit the organization and its stakeholders. The value of such benefits can be articulated both quantitatively and qualitatively. Chapter 3 adopts a holistic approach to developing a business case. L&D professionals must be knowledgeable about accounting and finance, but they need not be experts, as their expertise lies in L&D. Therefore, to successfully complete a business case, L&D professionals need to form teams comprising the right members, depending on what the business case is about. The political realities associated with developing a business case are also important considerations. How well L&D is able to ‘sell’ a business case depends on how well it is framed, usually as either a problem or an opportunity. We then discuss the information, data and metrics required to build a typical business case, specifically in terms of identifying the benefits and costs. The chapter concludes with suggestions on how the findings from the business case can be presented in an infographic-inspired form.

Relevance:

20.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-07

Relevance:

20.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

20.00%

Publisher:

Abstract:

The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
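For intuition, the brute-force sketch below computes the exact average number of guesses for a toy first-order model (i.i.d. letters), guessing words in decreasing order of probability. The letter probabilities and sizes are invented, and this exhaustive enumeration is precisely what becomes infeasible at the realistic alphabet and word sizes the paper targets.

```python
import numpy as np
from itertools import product

# Toy 4-letter alphabet; first-order approximation: letters are i.i.d.
letter_probs = np.array([0.5, 0.3, 0.15, 0.05])
word_length = 6

# probability of each word is the product of its letter probabilities
word_probs = np.array([np.prod(p)
                       for p in product(letter_probs, repeat=word_length)])
word_probs[::-1].sort()   # sort descending in place (optimal guessing order)

# average number of guesses: E[#guesses] = sum_i i * p_(i),
# where p_(i) is the i-th largest word probability
ranks = np.arange(1, len(word_probs) + 1)
avg_guesses = np.sum(ranks * word_probs)
print(f"{len(word_probs)} words, average guesses: {avg_guesses:.1f}")
print(f"fraction of total needed on average: {avg_guesses / len(word_probs):.3f}")
```

Increasing word_length shows the almost exponential decay of that fraction noted above, while the number of words to enumerate explodes, motivating the paper's approximations.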

Relevance:

20.00%

Publisher:

Abstract:

This study was designed to validate a constructivist learning framework for second language acquisition (SLA), herein referred to as Accessible Immersion Metrics (AIM), and to compare two delivery methods of the same framework. The AIM framework was originally developed in 2009 and is proposed as a ‘how-to’ guide for applying constructivist learning principles to the second language classroom. Piloted in 2010 at Champlain College St-Lambert, the AIM model allows language learning to occur free of a fixed schedule, to be socially constructive through the use of task-based assessments, and to remain relevant to the learner’s life experience by focusing on students’ needs rather than on course content.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: Intravoxel incoherent motion (IVIM) is an MRI technique with potential applications in measuring brain tumor perfusion, but its clinical impact remains to be determined. We assessed the usefulness of IVIM metrics in predicting survival in newly diagnosed glioblastoma. METHODS: Fifteen patients with glioblastoma underwent MRI including spin-echo echo-planar DWI using 13 b-values ranging from 0 to 1000 s/mm². Parametric maps for the diffusion coefficient (D), pseudodiffusion coefficient (D*), and perfusion fraction (f) were generated for contrast-enhancing regions (CER) and non-enhancing regions (NCER). Regions of interest were manually drawn in regions of maximum f and on the corresponding dynamic susceptibility contrast images. Prognostic factors were evaluated by Kaplan-Meier survival and Cox proportional hazards analyses. RESULTS: We found that fCER and D*CER correlated with rCBFCER. The best cutoffs for 6-month survival were fCER > 9.86% and D*CER > 21.712 × 10⁻³ mm²/s (100% sensitivity, 71.4% specificity, 100% and 80% positive predictive values, and 80% and 100% negative predictive values; AUC: 0.893 and 0.857, respectively). Treatment yielded the highest hazard ratio (5.484; 95% CI: 1.162-25.88; AUC: 0.723; P = 0.031); fCER combined with treatment predicted survival with 100% accuracy. CONCLUSIONS: The IVIM metrics fCER and D*CER are promising biomarkers of 6-month survival in newly diagnosed glioblastoma.
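As background on how maps like D, D* and f are produced, the sketch below fits the standard bi-exponential IVIM model S(b)/S0 = f·e^(−b·D*) + (1−f)·e^(−b·D) to a single voxel's signal decay. The b-value grid matches the 13 values mentioned above, but the synthetic signal, noise level, and fitting bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, f, D_star, D):
    """Standard bi-exponential IVIM model: S(b)/S0."""
    return f * np.exp(-b * D_star) + (1.0 - f) * np.exp(-b * D)

b_values = np.linspace(0, 1000, 13)             # s/mm^2, as in the study
true = dict(f=0.10, D_star=20e-3, D=0.8e-3)     # plausible magnitudes only
rng = np.random.default_rng(1)
signal = ivim(b_values, **true) + rng.normal(0, 0.005, b_values.size)

# bounds keep f in a physical range and separate the fast (D*) and
# slow (D) decay scales; values here are illustrative
popt, _ = curve_fit(ivim, b_values, signal,
                    p0=[0.1, 10e-3, 1e-3],
                    bounds=([0.0, 1e-3, 1e-4], [0.5, 100e-3, 3e-3]))
f_fit, D_star_fit, D_fit = popt
print(f"f = {f_fit:.2%}, D* = {D_star_fit*1e3:.2f}e-3 mm^2/s, "
      f"D = {D_fit*1e3:.2f}e-3 mm^2/s")
```

Repeating this fit voxel by voxel yields the parametric maps from which the region-of-interest statistics above are drawn.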

Relevance:

20.00%

Publisher:

Abstract:

This analysis paper presents previously unknown properties of some special cases of the Wright function whose consideration is necessitated by our work on probability theory and the theory of stochastic processes. Specifically, we establish new asymptotic properties of the particular Wright function

$${}_1\Psi_1(\rho, k; \rho, 0; x) = \sum_{n=0}^{\infty} \frac{\Gamma(k + \rho n)}{\Gamma(\rho n)} \frac{x^n}{n!} \qquad (|x| < \infty)$$

when the parameter $\rho \in (-1, 0) \cup (0, \infty)$ and the argument $x$ is real. In the probability theory applications, which are focused on studies of the Poisson-Tweedie mixtures, the parameter $k$ is a non-negative integer. Several representations involving well-known special functions are given for certain particular values of $\rho$. The asymptotics of ${}_1\Psi_1(\rho, k; \rho, 0; x)$ are obtained under numerous assumptions on the behavior of the arguments $k$ and $x$ when the parameter $\rho$ is both positive and negative. We also provide some integral representations and structural properties involving the ‘reduced’ Wright function ${}_0\Psi_1(--; \rho, 0; x)$ with $\rho \in (-1, 0) \cup (0, \infty)$, which might be useful for the derivation of new properties of members of the power-variance family of distributions. Some of these imply a reflection principle that connects the functions ${}_0\Psi_1(--; \pm\rho, 0; \cdot)$ and certain Bessel functions. Several asymptotic relationships for both particular cases of this function are also given. A few of these follow, under additional constraints, from probability theory results which, although previously available, were unknown to analysts.
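As a numerical companion to the series above, the sketch below evaluates truncated partial sums of ${}_1\Psi_1(\rho, k; \rho, 0; x)$ for non-negative integer $k$, rewriting the gamma ratio $\Gamma(k+\rho n)/\Gamma(\rho n)$ as the Pochhammer symbol $(\rho n)_k$, which also handles the $n = 0$ term cleanly. The truncation point is an assumption adequate only for moderate $|x|$.

```python
import numpy as np
from scipy.special import poch, factorial

def wright_1psi1(rho, k, x, n_max=100):
    """Partial sum of 1Psi1(rho, k; rho, 0; x) = sum_n (rho*n)_k x^n / n!.

    poch(z, k) computes Gamma(z + k) / Gamma(z), i.e. the Pochhammer
    symbol (z)_k; at n = 0 it gives (0)_k = 0 for k >= 1 and 1 for k = 0.
    """
    n = np.arange(n_max + 1)
    terms = poch(rho * n, k) * x**n / factorial(n)
    return terms.sum()

# sanity check: for k = 0 every Pochhammer factor is 1, so the series is e^x
print(wright_1psi1(rho=0.5, k=0, x=2.0), np.exp(2.0))
# k a non-negative integer, as in the Poisson-Tweedie applications
print(wright_1psi1(rho=0.5, k=3, x=2.0))
```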

Relevance:

20.00%

Publisher:

Abstract:

This was presented during the 2nd annual Library Research and Innovation Practices at the University of Maryland Libraries, McKeldin Library, on June 8, 2016.

Relevance:

20.00%

Publisher:

Abstract:

This presentation was one of four given during a Mid-Atlantic Regional Archives Conference session on April 15, 2016. Digitization of collections can help to improve internal workflows, make materials more accessible, and create new and engaging relationships with users. Laurie Gemmill Arp will discuss the LYRASIS Digitization Collaborative, created to assist institutions with their digitization needs, and how it has worked to help institutions increase connections with users. Robin Pike from the University of Maryland will discuss how they factor requests for access into selection for digitization and how they track the use of digitized materials. Laura Drake Davis of James Madison University will discuss the establishment of a formal digitization program, its impact on users, and the resulting increased use of their collections. Linda Tompkins-Baldwin will discuss Digital Maryland’s partnership with the Digital Public Library of America to provide access to archives held by institutions without a digitization program.
