19 results for Caratheodori Class Function
Abstract:
We consider integral equations of the form ψ(x) = φ(x) + ∫_Ω k(x, y) z(y) ψ(y) dy (in operator form ψ = φ + K_z ψ), where Ω is some subset of R^n (n ≥ 1). The functions k, z, and φ are assumed known, with z ∈ L^∞(Ω) and φ ∈ Y, the space of bounded continuous functions on Ω. The function ψ ∈ Y is to be determined. The class of domains Ω and kernels k considered includes the case Ω = R^n and k(x, y) = κ(x − y) with κ ∈ L^1(R^n), in which case, if z is the characteristic function of some set G, the integral equation is one of Wiener–Hopf type. The main theorems, proved using arguments derived from collectively compact operator theory, are conditions on a set W ⊂ L^∞(Ω) which ensure that if I − K_z is injective for all z ∈ W then I − K_z is also surjective and, moreover, the inverse operators (I − K_z)^{-1} on Y are bounded uniformly in z. These general theorems are used to recover classical results on Wiener–Hopf integral operators of [21] and [19], and generalisations of these results, and are applied to analyse the Lippmann–Schwinger integral equation.
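The setting above lends itself to a simple numerical illustration. The following is a minimal sketch, not taken from the paper, of a Nyström-type discretisation of ψ = φ + K_z ψ on a truncated one-dimensional domain; the kernel κ, the right-hand side φ, the set G, the truncation length and the grid size are all illustrative assumptions.

```python
import numpy as np

# Minimal numerical sketch (not from the paper): a Nystrom-type discretisation of
# psi(x) = phi(x) + int_Omega k(x, y) z(y) psi(y) dy on a truncated 1-D domain,
# with a convolution kernel k(x, y) = kappa(x - y), kappa in L^1(R), and z the
# characteristic function of a set G. All concrete choices below are assumptions.
L, m = 20.0, 801                              # truncation half-width and grid size (assumed)
x = np.linspace(-L, L, m)
h = x[1] - x[0]

kappa = lambda t: 0.5 * np.exp(-np.abs(t))    # example L^1 kernel
phi = np.exp(-x**2)                           # example bounded continuous phi
z = (np.abs(x) <= 5.0).astype(float)          # characteristic function of G = [-5, 5]

# Assemble (I - K_z) via the rectangle rule and solve for psi on the grid.
K = kappa(x[:, None] - x[None, :]) * z[None, :] * h
psi = np.linalg.solve(np.eye(m) - K, phi)
```

With this kernel the operator norm of K_z stays below one, so I − K_z is invertible and the linear solve is well conditioned; the theory in the abstract addresses when such invertibility holds uniformly over a whole family of functions z.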
Abstract:
This contribution proposes a novel probability density function (PDF) estimation-based over-sampling (PDFOS) approach for two-class imbalanced classification problems. The classical Parzen-window kernel function is adopted to estimate the PDF of the positive class. Then, according to the estimated PDF, synthetic instances are generated as additional training data. The essential concept is to re-balance the class distribution of the original imbalanced data set under the principle that the synthetic data samples follow the same statistical properties as the original positive-class data. Based on the over-sampled training data, the radial basis function (RBF) classifier is constructed by applying the orthogonal forward selection procedure, in which the classifier’s structure and the parameters of the RBF kernels are determined using a particle swarm optimisation algorithm based on the criterion of minimising the leave-one-out misclassification rate. The effectiveness of the proposed PDFOS approach is demonstrated by an empirical study on several imbalanced data sets.
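As a rough illustration of the over-sampling step described above (not the paper's exact estimator), one can fit a Gaussian Parzen-window density to the positive class and draw synthetic samples from it; the bandwidth rule, the toy data, and the helper name pdfos_oversample below are assumptions introduced for the sketch.

```python
import numpy as np

# Hedged sketch of PDF-estimation-based over-sampling (simplified): fit a
# Gaussian Parzen-window density to the positive class and draw synthetic
# samples from it. The bandwidth (Silverman's rule) is an assumption, not
# necessarily the estimator used in the paper.
def pdfos_oversample(X_pos, n_new, seed=None):
    rng = np.random.default_rng(seed)
    n, d = X_pos.shape
    h = (4.0 / (d + 2.0)) ** (1.0 / (d + 4.0)) * n ** (-1.0 / (d + 4.0))
    sigma = h * np.std(X_pos, axis=0)
    # Sampling from a Parzen-window density = pick a kernel centre, add kernel noise.
    centres = X_pos[rng.integers(0, n, size=n_new)]
    return centres + rng.normal(0.0, sigma, size=(n_new, d))

# Example: re-balance a toy imbalanced set before training the classifier.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=2.0, scale=0.5, size=(30, 2))    # minority (positive) class
X_neg = rng.normal(loc=0.0, scale=1.0, size=(300, 2))   # majority class
X_syn = pdfos_oversample(X_pos, n_new=len(X_neg) - len(X_pos), seed=1)
```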
Abstract:
We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier’s predicted class labels and the true class labels. We derive the formula of LOOMI within the OFS framework so that the LOOMI can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is also integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage is effectively the fitting of a one-variable model, this task is very fast. The classifier construction procedure terminates automatically, without the need for an additional stopping criterion, and yields very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of our proposed approach.
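To make the forward-selection loop concrete, here is a hedged sketch, not the paper's algorithm: RBF centres are added greedily, and each candidate is scored by the plain mutual information between predicted and true labels on the training data, standing in for the LOOMI criterion. The orthogonalisation, the Bayesian local regularisation, the width choice and the automatic stopping rule are all omitted or assumed.

```python
import numpy as np

# Greedy forward selection of RBF centres for a two-class problem (sketch only).
def mutual_info(y_true, y_pred):
    # Mutual information between two binary label vectors, from the 2x2 table.
    mi, eps = 0.0, 1e-12
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((y_true == a) & (y_pred == b))
            p_a, p_b = np.mean(y_true == a), np.mean(y_pred == b)
            if p_ab > eps:
                mi += p_ab * np.log(p_ab / (p_a * p_b + eps))
    return mi

def rbf_col(X, centre, width=1.0):
    # One candidate regressor column: Gaussian RBF centred at a training point.
    return np.exp(-np.sum((X - centre) ** 2, axis=1) / (2.0 * width ** 2))

def forward_select_rbf(X, y, max_terms=10, width=1.0):
    selected, Phi = [], np.ones((len(X), 1))           # start from a bias column
    for _ in range(max_terms):
        best = None
        for i in range(len(X)):
            if i in selected:
                continue
            P = np.hstack([Phi, rbf_col(X, X[i], width)[:, None]])
            w, *_ = np.linalg.lstsq(P, y, rcond=None)  # least-squares weights
            score = mutual_info(y, (P @ w > 0.5).astype(int))
            if best is None or score > best[0]:
                best = (score, i, P)
        _, i, Phi = best
        selected.append(i)                             # keep the best centre
    return selected, Phi
```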
Abstract:
Starting with the work of Lapidus and van Frankenhuysen, a number of papers have introduced zeta functions as a way of capturing multifractal information. In this paper we propose a new multifractal zeta function and show that under certain conditions the abscissa of convergence yields the Hausdorff multifractal spectrum for a class of measures.
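As background on the zeta-function approach referenced above, and not the paper's new definition, the geometric zeta function of a fractal string in the Lapidus–van Frankenhuysen framework and its abscissa of convergence can be sketched as follows.

```latex
\[
  \zeta_{\mathcal{L}}(s) \;=\; \sum_{j \ge 1} \ell_j^{\,s},
  \qquad
  \sigma_{\mathcal{L}} \;=\; \inf\Bigl\{ s \in \mathbb{R} \;:\; \sum_{j \ge 1} \ell_j^{\,s} < \infty \Bigr\}.
\]
```

Here ℓ_1 ≥ ℓ_2 ≥ ⋯ are the lengths of the string, and for a nontrivial string the abscissa of convergence σ_ℒ coincides with the Minkowski dimension of its boundary; the multifractal zeta function proposed in the paper plays an analogous role, with its abscissa of convergence recovering the Hausdorff multifractal spectrum of the measure under the stated conditions.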