24 results for Probability Density Function

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:

Automatically partitioning instructional videos into topic sections, a prerequisite for efficient content management and cataloging in e-learning environments, is a challenging problem. This paper addresses this problem by proposing a novel density function to delineate sections underscored by changes in topics in instructional and training videos. The content density function is motivated by the observation that topic boundaries coincide with the ebb and flow of the 'density' of content shown in these videos. Based on this function, we propose two methods for high-level segmentation by determining topic boundaries. We study the performance of the two methods on eight training videos, and our experimental results demonstrate the effectiveness and robustness of the two proposed high-level segmentation algorithms for learning media.
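
The abstract does not specify the density function itself, but the boundary-detection step it describes can be illustrated with a short sketch: given any per-frame content-density signal, candidate topic boundaries fall at its local minima. The signal, the `min_separation` parameter, and the use of `scipy.signal.find_peaks` are illustrative assumptions, not the paper's method.

```python
# A minimal sketch of boundary detection on a generic content-density
# signal; the paper's actual density definition is not reproduced here.
import numpy as np
from scipy.signal import find_peaks

def topic_boundaries(density: np.ndarray, min_separation: int = 30):
    """Return indices where the content density dips to a local minimum.

    `density` is assumed to be a per-frame (or per-second) measure of
    on-screen content; its valleys are taken as candidate topic boundaries.
    """
    # Local minima of the density are local maxima of its negation.
    valleys, _ = find_peaks(-density, distance=min_separation)
    return valleys

# Hypothetical usage: a density trace that rises and falls twice.
t = np.linspace(0, 4 * np.pi, 400)
density = np.abs(np.sin(t)) + 0.05 * np.random.default_rng(0).random(400)
print(topic_boundaries(density))
```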

Relevance:

100.00%

Publisher:

Abstract:

Accurate assessment of the fate of salts, nutrients, and pollutants in natural, heterogeneous soils requires a proper quantification of both spatial and temporal solute spreading during solute movement. The number of experiments with multisampler devices that measure solute leaching as a function of space and time is increasing. The breakthrough curve (BTC) can characterize the temporal aspect of solute leaching, and recently the spatial solute distribution curve (SSDC) was introduced to describe the spatial solute distribution. We combined and extended both concepts to develop a tool for the comprehensive analysis of the full spatio-temporal behavior of solute leaching. The sampling locations are ranked in order of descending amount of total leaching (defined as the cumulative leaching from an individual compartment at the end of the experiment), thus collapsing both spatial axes of the sampling plane into one. The leaching process can then be described by a curved surface that is a function of the single spatial coordinate and time. This leaching surface is scaled to integrate to unity; the scaled surface, termed S, can efficiently represent data from multisampler solute transport experiments or simulation results from multidimensional solute transport models. The mathematical relationships between the scaled leaching surface S, the BTC, and the SSDC are established. Any desired characteristic of the leaching process can be derived from S. The analysis was applied to a chloride leaching experiment on a lysimeter with 300 drainage compartments of 25 cm² each. The sandy soil monolith in the lysimeter exhibited fingered flow in the water-repellent top layer. The observed S demonstrated the absence of a sharp separation between fingers and dry areas, owing to diverging flow in the wettable soil below the fingers. Times-to-peak, maximum solute fluxes, and total leaching varied more in high-leaching than in low-leaching compartments. This suggests a stochastic–convective transport process in the high-flow streamtubes, while convection–dispersion is predominant in the low-flow areas. S can be viewed as a bivariate probability density function. Its marginal distributions are the BTC of all sampling locations combined, and the SSDC of cumulative solute leaching at the end of the experiment. The observed S cannot be represented by assuming complete independence between its marginal distributions, indicating that S contains information about the leaching process that cannot be derived from the combination of the BTC and the SSDC.
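
A minimal sketch of the construction described above, assuming leaching is recorded as a (compartments × time) array of fluxes: compartments are ranked by total leaching, the surface is scaled to unit sum, and the BTC and SSDC fall out as its marginals. The array shapes and gamma-distributed test data are hypothetical.

```python
# A minimal sketch of the scaled leaching surface S and its marginals;
# normalization details are illustrative, not the paper's exact construction.
import numpy as np

def scaled_leaching_surface(flux: np.ndarray):
    """Rank compartments by total leaching and scale the surface to
    integrate (sum) to unity, mimicking the bivariate-pdf view of S."""
    totals = flux.sum(axis=1)            # cumulative leaching per compartment
    order = np.argsort(totals)[::-1]     # descending total leaching
    S = flux[order].astype(float)
    S /= S.sum()                         # unit "integral" on the grid
    btc = S.sum(axis=0)                  # marginal over space -> BTC
    ssdc = S.sum(axis=1)                 # marginal over time  -> SSDC
    return S, btc, ssdc

# Hypothetical data: 300 compartments, 100 sampling times.
rng = np.random.default_rng(1)
flux = rng.gamma(shape=2.0, scale=1.0, size=(300, 100))
S, btc, ssdc = scaled_leaching_surface(flux)
print(S.shape, btc.sum(), ssdc.sum())    # each marginal sums to 1
```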

Relevance:

100.00%

Publisher:

Abstract:

Recently, a patchwork-based audio watermarking scheme has been proposed in [1], which embeds watermarks by modifying the means of absolute-valued discrete cosine transform (DCT) coefficients corresponding to suitable fragments. This audio watermarking scheme is more robust to common attacks than the existing counterparts. In this paper, we present a detailed analysis of this audio watermarking scheme. We first derive a probability density function (pdf) of a random variable corresponding to the mean of an absolute-valued DCT fragment. Then, based on the obtained pdf, we show how watermarking parameters affect the performance of the concerned audio watermarking scheme. The analysis provides a guideline for the selection of watermarking parameters. The effectiveness of our analysis is verified by simulations using a large number of real-world audio segments.
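
The derived closed-form pdf is not reproduced in the abstract, but the random variable it concerns, the mean of absolute-valued DCT coefficients over a fragment, is easy to simulate. A sketch follows; the frame and fragment sizes are illustrative assumptions.

```python
# A minimal sketch of the analyzed quantity: the mean of |DCT| coefficients
# over a fragment. Signal and sizes are illustrative; the paper's derived
# closed-form pdf is not reproduced.
import numpy as np
from scipy.fft import dct

def fragment_means(signal: np.ndarray, frame: int = 1024, fragment: int = 64):
    """DCT each frame, then return the mean of |DCT| over consecutive
    fragments of coefficients."""
    n_frames = len(signal) // frame
    frames = signal[: n_frames * frame].reshape(n_frames, frame)
    coeffs = np.abs(dct(frames, norm="ortho", axis=1))
    frags = coeffs.reshape(n_frames, frame // fragment, fragment)
    return frags.mean(axis=2).ravel()

# By the central limit theorem these means are roughly Gaussian, which is
# the kind of structure a derived pdf would capture.
rng = np.random.default_rng(2)
means = fragment_means(rng.standard_normal(1 << 16))
print(means.mean(), means.std())
```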

Relevance:

100.00%

Publisher:

Abstract:

Developing a watermarking method that is robust to cropping attack is a challenging task in image watermarking. The moment-based watermarking schemes show good robustness to common signal processing attacks and some geometric attacks but are sensitive to cropping attack. In this paper, we modify the moment-based approach to deal with cropping attack. Firstly, we find the probability density function (pdf) of the pixel value distribution from the original image. Secondly, we reshape and normalize the pdf of the pixel value distribution (PPVD) to form a two-dimensional image. Then, the moment invariants are calculated from the PPVD image. Since the PPVD is insensitive to cropping, the proposed method is robust to cropping attack. It also has high robustness against other common attacks. Experimental results demonstrate the effectiveness of the proposed method.
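
A minimal sketch of the PPVD pipeline as described: histogram the pixel values into a pdf, reshape and normalize it into a small two-dimensional image, and compute moment invariants from that image. The 16 × 16 PPVD shape is an assumption, and OpenCV's Hu moments stand in for whichever moment invariants the paper uses.

```python
# A minimal sketch of the PPVD idea; shapes and the choice of Hu
# invariants are illustrative assumptions.
import numpy as np
import cv2  # OpenCV supplies image moments and Hu invariants

def ppvd_invariants(image: np.ndarray, shape=(16, 16)):
    # Pixel-value pdf over as many bins as PPVD pixels.
    pdf, _ = np.histogram(image.ravel(), bins=shape[0] * shape[1],
                          range=(0, 256), density=True)
    # Reshape and normalize the pdf into a 2-D "PPVD image".
    ppvd = (pdf / pdf.max()).reshape(shape).astype(np.float32)
    return cv2.HuMoments(cv2.moments(ppvd)).ravel()

# Hypothetical check: cropping should barely change the invariants,
# because the pixel-value distribution is largely preserved.
rng = np.random.default_rng(3)
img = rng.integers(0, 256, size=(256, 256)).astype(np.uint8)
full = ppvd_invariants(img)
cropped = ppvd_invariants(img[32:224, 32:224])
print(np.max(np.abs(full - cropped)))
```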

Relevance:

100.00%

Publisher:

Abstract:

Developing a watermarking method that is robust to cropping attack is a challenging task in image watermarking. The moment-based watermarking schemes show good robustness to common signal processing attacks and some geometric attacks but are sensitive to cropping attack. In this paper, we modify the moment-based approach to deal with cropping attack. Firstly, we find the probability density function (pdf) of the pixel value distribution from the original image. Secondly, we reshape and normalize the pdf of the pixel value distribution (PPVD) to form a two-dimensional image. Then, the moment invariants are calculated from the PPVD image. Since the PPVD is insensitive to cropping, the proposed method is robust to cropping attack. It also has high robustness against other common attacks. Theoretical analysis and experimental results demonstrate the effectiveness of the proposed method.

Relevance:

90.00%

Publisher:

Abstract:

The acceptance/rejection approach is widely used in universal nonuniform random number generators. Its key part is an accurate approximation of a given probability density from above by a hat function. This article uses a piecewise constant hat function whose values overestimate the density on the elements of the partition of the domain. It uses a sawtooth overestimate of Lipschitz continuous densities, and then examines all local maximizers of such an overestimate. The method is applicable to multivariate multimodal distributions. It exhibits relatively short preprocessing time and fast generation of random variates from a very large class of distributions.
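
A minimal sketch of acceptance/rejection under a piecewise constant hat, in one dimension: each cell's hat value overestimates the density on that cell, a cell is drawn in proportion to its hat mass, and a uniform point in the cell is accepted if it falls under the density. The grid-maximum hat below is a simple stand-in for the article's sawtooth Lipschitz overestimate.

```python
# A minimal sketch of acceptance/rejection with a piecewise-constant hat;
# the hat values are inflated interval maxima of the target density, not
# the article's sawtooth Lipschitz construction.
import numpy as np

def sample_with_hat(density, lo, hi, n_cells=64, n_samples=10_000, seed=0):
    rng = np.random.default_rng(seed)
    edges = np.linspace(lo, hi, n_cells + 1)
    # Overestimate the density on each cell via a dense-grid maximum,
    # inflated slightly so it stays above the density between grid points.
    grid = np.linspace(lo, hi, 20 * n_cells)
    vals = density(grid)
    hat = np.array([vals[(grid >= a) & (grid <= b)].max() * 1.05
                    for a, b in zip(edges[:-1], edges[1:])])
    cell_mass = hat * np.diff(edges)
    probs = cell_mass / cell_mass.sum()
    out = []
    while len(out) < n_samples:
        c = rng.choice(n_cells, p=probs)           # pick a cell by hat mass
        x = rng.uniform(edges[c], edges[c + 1])    # uniform within the cell
        if rng.uniform(0, hat[c]) <= density(x):   # accept under the hat
            out.append(x)
    return np.asarray(out)

# Example: an unnormalized bimodal target density.
f = lambda x: np.exp(-0.5 * (x + 2) ** 2) + 0.8 * np.exp(-0.5 * (x - 2) ** 2)
print(sample_with_hat(f, -6, 6).mean())
```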

Relevance:

90.00%

Publisher:

Abstract:

Mineral prospectivity mapping is the process of combining maps containing different geoscientific data sets to produce a single map depicting areas ranked according to their potential to host mineral deposits of a particular type. This paper outlines two approaches for deriving a function which can be used to assign to each cell in the study area a value representing the posterior probability that the cell contains a deposit of the sought-after mineral. One approach is based on estimating probability density functions (pdfs); the second uses multilayer perceptrons (MLPs). Results are provided from applying these approaches to geoscientific datasets covering a region in North Western Victoria, Australia. The results demonstrate that while both the Bayesian approach and the MLP approach yield similar results when the number of input dimensions is small, the Bayesian approach rapidly becomes unstable as the number of input dimensions increases, with the resulting maps displaying high sensitivity to the number of mixtures used to model the distributions. However, despite the fact that Bayesian-assigned values cannot be interpreted as posterior probabilities in high-dimensional input spaces, the pixel favorability rankings produced by the two methods are similar.
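
A minimal sketch of the pdf-based branch, assuming class-conditional densities for deposit and barren cells are each modeled by a Gaussian mixture and combined through Bayes' rule; the data, prior, and mixture sizes are illustrative only.

```python
# A minimal sketch of the Bayesian (pdf-estimation) approach; priors,
# mixture sizes, and the synthetic data are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

def posterior_deposit(X_dep, X_bar, X_query, n_mix=3, prior_dep=0.01):
    g_dep = GaussianMixture(n_mix, random_state=0).fit(X_dep)
    g_bar = GaussianMixture(n_mix, random_state=0).fit(X_bar)
    # log p(x | deposit) p(deposit) and log p(x | barren) p(barren)
    ld = g_dep.score_samples(X_query) + np.log(prior_dep)
    lb = g_bar.score_samples(X_query) + np.log(1 - prior_dep)
    return 1.0 / (1.0 + np.exp(lb - ld))   # p(deposit | x) via Bayes' rule

# Hypothetical geoscientific features for deposit-bearing vs barren cells.
rng = np.random.default_rng(4)
X_dep = rng.normal(1.0, 1.0, size=(200, 4))
X_bar = rng.normal(0.0, 1.0, size=(5000, 4))
print(posterior_deposit(X_dep, X_bar, rng.normal(size=(5, 4))))
```

As the abstract notes, this density-estimation step is exactly what becomes unstable as the input dimension grows: the fitted posteriors become sensitive to `n_mix`, even while the relative ranking of cells may remain usable.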

Relevance:

90.00%

Publisher:

Abstract:

A hidden martingale restriction is developed for option pricing models based on Gram-Charlier expansions of the normal density function. The restriction is hidden behind a reduction in parameter space for the Gram-Charlier expansion coefficients. The resulting restriction is invisible in the option price.
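
For reference, a standard truncated Gram-Charlier A expansion of the normal density, the kind of expansion such option-pricing models build on (the paper's hidden restriction on the coefficients is not reproduced here):

```latex
% A truncated Gram-Charlier A density about the standard normal:
f(z) = \phi(z)\left[ 1 + \frac{\gamma_1}{3!}\,\mathrm{He}_3(z)
                       + \frac{\gamma_2}{4!}\,\mathrm{He}_4(z) \right],
\qquad
\phi(z) = \frac{e^{-z^{2}/2}}{\sqrt{2\pi}},
```

where $\mathrm{He}_3(z) = z^3 - 3z$ and $\mathrm{He}_4(z) = z^4 - 6z^2 + 3$ are probabilists' Hermite polynomials and $\gamma_1$, $\gamma_2$ act as skewness and excess-kurtosis parameters; the martingale restriction described in the abstract is a constraint on this coefficient space.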

Relevance:

80.00%

Publisher:

Abstract:

Ten anionic compounds, including four acidic dyes, were used to dope polypyrrole powder. The effects of the dopants on density, optical absorption and conductivity of the polypyrroles were studied. The presence of the dopant in the conducting polymer matrix was verified by ATR-FTIR spectroscopy. Density functional theory (DFT) simulation was used to understand the effect of the dopants on the solid structure, optical absorption and energy band structures. Anthraquinone-2-sulfonic acid-doped polypyrrole yielded the highest conductivity. The dye-doped polypyrrole showed an enhancement in its UV–vis optical absorption.


Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a practical and cost-effective approach to construct a fully distributed roadside communication infrastructure to facilitate localized content dissemination to vehicles in urban areas. The proposed infrastructure is composed of distributed lightweight low-cost devices called roadside buffers (RSBs), where each RSB has limited buffer storage and is able to wirelessly transmit the cached contents to fast-moving vehicles. To enable the distributed RSBs to work toward the globally optimal performance (e.g., minimal average file download delays), we propose a fully distributed algorithm to optimally determine the content replication strategy at RSBs. Specifically, we first develop a generic analytical model to evaluate the download delay of files, given the probability density of file distribution at RSBs. Then, we formulate the RSB content replication process as an optimization problem and accordingly devise a fully distributed content replication scheme that enables vehicles to intelligently recommend desirable content files to RSBs. The proposed infrastructure is designed to optimize the global network utility, which accounts for the integrated download experience of users and the download demands of files. Using extensive simulations, we validate the effectiveness of the proposed infrastructure and show that the proposed distributed protocol can approach the optimal performance and significantly outperform traditional heuristics.
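
The paper's delay model and protocol are not given in the abstract; the following sketch only conveys the flavor of such an optimization. If a vehicle encounters RSBs at rate μ and a file is cached with probability p_i, its expected wait is roughly 1/(μ p_i); minimizing the demand-weighted delay under a total-storage constraint then gives a square-root allocation. All names and numbers are hypothetical, and this generic caching argument is not the paper's exact formulation.

```python
# A minimal sketch: minimize sum_i d_i / p_i subject to sum_i p_i = C.
# Lagrange conditions give p_i proportional to sqrt(d_i).
import numpy as np

def optimal_replication(demand: np.ndarray, capacity: float = 1.0):
    """Square-root replication: cache popular files more, but sublinearly."""
    p = np.sqrt(demand)
    return capacity * p / p.sum()

demand = np.array([0.5, 0.3, 0.15, 0.05])   # hypothetical file popularity
p = optimal_replication(demand)
mu = 2.0                                    # RSB encounters per minute
print(p, (demand / (mu * p)).sum())         # replication and mean delay
```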

Relevance:

80.00%

Publisher:

Abstract:

Among the current clustering algorithms for complex networks, Laplacian-based spectral clustering algorithms have the advantage of a rigorous mathematical basis and high accuracy. However, their applications are limited by their dependence on prior knowledge, such as the number of clusters, which is hard to obtain beforehand in most application scenarios. To address this problem, we propose a novel clustering algorithm, the Jordan-Form of Laplacian-Matrix based Clustering algorithm (JLMC). In JLMC, we propose a model to calculate the number n of clusters in a complex network based on the Jordan form of its corresponding Laplacian matrix. JLMC then clusters the network into n clusters using our proposed modularity density function (P function). We conduct extensive experiments over real and synthetic data, and the experimental results reveal that JLMC can accurately obtain the number of clusters in a complex network and outperforms the Fast-Newman and Girvan-Newman algorithms in terms of clustering accuracy and time complexity.
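
For a symmetric Laplacian the Jordan form is diagonal, so the cluster count can be read from the spectrum; the sketch below uses the standard eigengap heuristic as a stand-in for JLMC's actual model, with a hypothetical three-clique graph as input.

```python
# A minimal sketch of estimating the cluster count from the Laplacian
# spectrum (eigengap heuristic); this is not the paper's exact procedure.
import numpy as np

def estimate_n_clusters(adj: np.ndarray, k_max: int = 10):
    deg = adj.sum(axis=1)
    L = np.diag(deg) - adj                    # unnormalized Laplacian
    eig = np.sort(np.linalg.eigvalsh(L))[:k_max]
    gaps = np.diff(eig)                       # gaps[k-1] = eig[k] - eig[k-1]
    return int(np.argmax(gaps)) + 1           # k with the largest eigengap

# Hypothetical graph: three disconnected 4-cliques -> 3 clusters.
block = np.ones((4, 4)) - np.eye(4)
adj = np.kron(np.eye(3), block)
print(estimate_n_clusters(adj))  # -> 3
```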

Relevance:

80.00%

Publisher:

Abstract:

Complexity analysis of a given time series is executed using various measures of irregularity, the most commonly used being approximate entropy (ApEn), sample entropy (SampEn) and fuzzy entropy (FuzzyEn). However, the dependence of these measures on the critical tolerance parameter r leads to unreliable results when r is selected arbitrarily. Attempts to eliminate the use of r in entropy calculations introduced a new measure of entropy, namely distribution entropy (DistEn), based on the empirical probability distribution function (ePDF). DistEn completely avoids the use of a variance-dependent parameter like r and replaces it with a parameter M, which corresponds to the number of bins used in the histogram from which it is calculated. When tested on synthetic data, M has been observed to have a minimal effect on DistEn compared to the effect of r on other entropy measures. DistEn is also said to be relatively stable under data length (N) variations, as far as synthetic data is concerned. However, these claims have not been analyzed for physiological data. Our study evaluates the effect of data length N and bin number M on the performance of DistEn using both synthetic and physiologic time series data. Synthetic logistic data of 'periodic' and 'chaotic' levels of complexity and 40 RR-interval time series belonging to two groups of a healthy aging population (young and elderly) have been used for the analysis. The stability and consistency of DistEn as a complexity measure as well as a classifier have been studied. Experiments show that the parameters N and M are more influential in deciding the efficacy of DistEn in the case of physiologic data than synthetic data. Therefore, a random selection of M for a given data length N may not always be an appropriate combination to yield good performance of DistEn for physiologic data.
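
A minimal sketch of DistEn as described: embed the series in m dimensions, histogram all pairwise Chebyshev distances into M bins (the ePDF), and take the normalized Shannon entropy of that histogram. The parameter defaults are illustrative.

```python
# A minimal sketch of distribution entropy (DistEn); embedding dimension
# m and bin count M are illustrative defaults.
import numpy as np

def dist_en(x: np.ndarray, m: int = 2, M: int = 512) -> float:
    # Embed the series into overlapping m-dimensional vectors.
    N = len(x) - m + 1
    emb = np.lib.stride_tricks.sliding_window_view(x, m)
    # All pairwise Chebyshev (max-norm) distances between embedded vectors.
    d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
    d = d[np.triu_indices(N, k=1)]
    # Empirical distribution of distances over M bins (the ePDF) ...
    p, _ = np.histogram(d, bins=M)
    p = p / p.sum()
    p = p[p > 0]
    # ... and its Shannon entropy, normalized to [0, 1] by log2(M).
    return float(-(p * np.log2(p)).sum() / np.log2(M))

rng = np.random.default_rng(5)
print(dist_en(rng.standard_normal(500)))          # irregular -> closer to 1
print(dist_en(np.sin(np.linspace(0, 20, 500))))   # regular -> smaller
```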

Relevance:

40.00%

Publisher:

Abstract:

FRAX® evaluates 10-year fracture probabilities and can be calculated with and without bone mineral density (BMD). Low socioeconomic status (SES) may affect BMD, and is associated with increased fracture risk. Clinical risk factors differ by SES; however, it is unknown whether an interaction exists between SES and FRAX determined with and without BMD. From the Geelong Osteoporosis Study, we drew 819 females aged ≥50 years. Clinical data were collected during 1993-1997. SES was determined by cross-referencing residential addresses with Australian Bureau of Statistics census data and categorized in quintiles. BMD was measured by dual-energy X-ray absorptiometry at the same time as other clinical data were collected. Ten-year fracture probabilities were calculated using FRAX (Australia). Using multivariable regression analyses, we examined whether interactions existed between SES and 10-year probability of hip fracture and of any major osteoporotic fracture (MOF), defined by use of FRAX with and without BMD. We observed a trend for a SES * FRAX(no-BMD) interaction term for 10-year hip fracture probability (p = 0.09), but not for MOF (p = 0.42). In women without prior fracture (n = 518), we observed a significant SES * FRAX(no-BMD) interaction term for hip fracture (p = 0.03) and MOF (p = 0.04). SES does not appear to interact with 10-year fracture probabilities determined by FRAX with and without BMD in women with previous fracture; however, an interaction does appear to exist for those without previous fracture.
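
The modeling step described, testing an SES * FRAX(no-BMD) interaction in a multivariable model, can be sketched as follows; the dataframe, column names, logit link, and simulated effect sizes are all hypothetical, not the study's actual specification.

```python
# A minimal sketch of an SES x FRAX interaction test; data and model
# specification are hypothetical stand-ins for the study's analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 500
df = pd.DataFrame({
    "frax_no_bmd": rng.gamma(2.0, 2.0, n),   # 10-year probability, %
    "ses": rng.integers(1, 6, n),            # SES quintile 1..5
})
# Simulate fractures with a small SES-by-FRAX interaction effect.
df["fracture"] = rng.binomial(1, 1 / (1 + np.exp(-(
    -3 + 0.15 * df.frax_no_bmd + 0.02 * df.ses * df.frax_no_bmd))))

# '*' expands to both main effects plus the ses:frax_no_bmd interaction.
fit = smf.logit("fracture ~ ses * frax_no_bmd", data=df).fit(disp=0)
print(fit.pvalues["ses:frax_no_bmd"])
```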