967 results for Gravitational kernel


Relevance:

20.00%

Publisher:

Abstract:

In this paper, we consider the problem of time series classification. Using piecewise linear interpolation, we obtain several novel kernels that can be used with support vector machines to design classifiers capable of deciding the class of a given time series. The approach is general and applicable in many scenarios. We apply the method to the task of online Tamil handwritten character recognition, with promising results.
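The abstract does not spell out the kernel construction, so the sketch below only illustrates the general idea under stated assumptions: variable-length series are resampled onto a common grid by piecewise linear interpolation and the resulting fixed-length vectors are fed to an off-the-shelf SVM kernel. The toy data, grid size, and kernel choice are all illustrative, not the authors' method.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def resample_linear(series, n_points=64):
    """Piecewise linear interpolation of a variable-length 1-D series
    onto a fixed grid of n_points samples."""
    series = np.asarray(series, dtype=float)
    old_t = np.linspace(0.0, 1.0, len(series))
    new_t = np.linspace(0.0, 1.0, n_points)
    return np.interp(new_t, old_t, series)

# Toy data: class 0 = rising ramps, class 1 = falling ramps, random lengths.
rng = np.random.default_rng(0)
X_raw, y = [], []
for label, slope in [(0, 1.0), (1, -1.0)]:
    for _ in range(100):
        n = rng.integers(40, 90)
        t = np.linspace(0.0, 1.0, n)
        X_raw.append(slope * t + rng.normal(0.0, 0.1, n))
        y.append(label)

X = np.vstack([resample_linear(s) for s in X_raw])
y = np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```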

Relevance:

20.00%

Publisher:

Abstract:

In this work, the collapse of a spherically symmetric star made of a dust cloud in the background of dark energy is studied for two different gravity theories, namely DGP brane gravity and loop quantum gravity. Two types of dark energy fluid, the modified Chaplygin gas and the generalised cosmic Chaplygin gas, are considered for each model. Graphs are drawn to characterize the nature and probable outcome of the gravitational collapse, and a comparative study is made of the collapsing process in the two gravity theories. It is found that in the case of dark matter there is a strong possibility of collapse and consequent formation of a black hole. In the case of dark energy the possibility of collapse is far smaller, owing to the large negative pressure of the dark energy component. The mass of the cloud increases during dark matter collapse due to matter accumulation, whereas it decreases considerably in the dark energy case due to dark energy accretion onto the cloud. For collapse of a combination of dark energy and dark matter, it is found that in the absence of interaction a black hole forms far more readily in the DGP brane model than in the loop quantum cosmology model.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study the problem of designing SVM classifiers when the kernel matrix K is affected by uncertainty. Specifically, K is modeled as a positive affine combination of given positive semidefinite kernels, with the coefficients ranging in a norm-bounded uncertainty set. We treat the problem using the robust optimization methodology, which reduces the uncertain SVM problem to a deterministic conic quadratic problem that can in principle be solved by a polynomial-time interior point (IP) algorithm. However, for large-scale classification problems IP methods become intractable, and one has to resort to first-order gradient-type methods. The strategy we use here is to reformulate the robust counterpart of the uncertain SVM problem as a saddle point problem and employ a special gradient scheme which works directly on the convex-concave saddle function. The algorithm is a simplified version of a general scheme due to Juditski and Nemirovski (2011) and achieves an O(1/T^2) reduction of the initial error after T iterations. A comprehensive empirical study on both synthetic data and real-world protein structure data sets shows that the proposed formulations achieve the desired robustness, and that the saddle point based algorithm significantly outperforms the IP method.
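As a rough illustration of the nominal (non-robust) ingredient of this setup, the sketch below forms a fixed affine combination of a few positive semidefinite base kernels and trains an SVM on the precomputed combination. The robust saddle-point algorithm itself is not reproduced; the data, base kernels, and weights are made-up assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel, linear_kernel

# Toy data: two Gaussian blobs in 5 dimensions.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (50, 5)), rng.normal(+1, 1, (50, 5))])
y = np.array([0] * 50 + [1] * 50)

# A few base positive semidefinite kernels evaluated on the data.
base_kernels = [linear_kernel(X), polynomial_kernel(X, degree=2), rbf_kernel(X, gamma=0.5)]

# Nominal affine (convex) combination; in the robust setting the coefficients
# would range over a norm-bounded uncertainty set instead of being fixed.
weights = np.array([0.3, 0.3, 0.4])
K = sum(w * Kb for w, Kb in zip(weights, base_kernels))

clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```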

Relevance:

20.00%

Publisher:

Abstract:

A natural class of weighted Bergman spaces on the symmetrized polydisc is isometrically embedded as a subspace of the corresponding weighted Bergman space on the polydisc. We find an orthonormal basis for this subspace, which enables us to compute the kernel function for the weighted Bergman spaces on the symmetrized polydisc using the explicit nature of our embedding. This family of kernel functions includes the Szegő kernel and the Bergman kernel on the symmetrized polydisc.
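For orientation, the following standard facts (not the paper's computation) fix the notation: the symmetrized polydisc is the image of the polydisc under the elementary symmetric polynomials, and on the polydisc itself the Szegő and Bergman kernels are products of the one-variable kernels, constants omitted.

```latex
% Standard background, not the paper's result: G_n is the image of D^n under
\[
  \pi(z_1,\dots,z_n) \;=\; \bigl(e_1(z),\,e_2(z),\,\dots,\,e_n(z)\bigr),
\]
% where e_k is the k-th elementary symmetric polynomial, and on the polydisc
% the Szego and Bergman kernels are, up to normalizing constants,
\[
  S_{\mathbb{D}^n}(z,w) \;=\; \prod_{j=1}^{n} \frac{1}{1 - z_j\overline{w_j}},
  \qquad
  B_{\mathbb{D}^n}(z,w) \;=\; \prod_{j=1}^{n} \frac{1}{(1 - z_j\overline{w_j})^{2}}.
\]
```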

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to obtain certain characterizations of the image of a Sobolev space on the Heisenberg group under the heat kernel transform. We give three types of characterizations of the image of a Sobolev space of positive order H^m(H^n), m ∈ N^n, under the heat kernel transform on H^n, using a direct sum and a direct integral of Bergman spaces and certain unitary representations of H^n which can be realized on the Hilbert space of Hilbert-Schmidt operators on L^2(R^n). We also show that the image of the Sobolev space of negative order H^{-s}(H^n), s (> 0) ∈ R, is a direct sum of two weighted Bergman spaces. Finally, we try to obtain some pointwise estimates for the functions in the image of the Schwartz class on H^n under the heat kernel transform. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
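For readers unfamiliar with the transform, the sketch below records the usual Segal–Bargmann-style definition of the heat kernel transform; the paper's precise conventions may differ, so this is background only.

```latex
% For orientation, the heat kernel transform sends f to the holomorphic
% extension of the heat semigroup applied to f:
\[
  (T_t f)(z) \;=\; (f * p_t)(z), \qquad z \in \mathbb{H}^n_{\mathbb{C}},
\]
% where p_t is the heat kernel on the Heisenberg group \mathbb{H}^n and
% f * p_t extends holomorphically to the complexification \mathbb{H}^n_{\mathbb{C}}.
```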

Relevance:

20.00%

Publisher:

Abstract:

We consider four-dimensional CFTs which admit a large-N expansion and whose spectrum contains states whose conformal dimensions do not scale with N. We explicitly reorganise the partition function obtained by exponentiating the one-particle partition function of these states into a heat kernel form for the dual string spectrum on AdS_5. On very general grounds, the heat kernel answer can be expressed in terms of a convolution of the one-particle partition function of the light states in the four-dimensional CFT. © 2013 Elsevier B.V. All rights reserved.
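The exponentiation the abstract alludes to is the standard relation between the one-particle partition function and the full partition function of a free bosonic gas; fermionic states would contribute alternating signs, which are suppressed in this background sketch.

```latex
% Standard multi-particle exponentiation of the one-particle partition function
% (bosonic case):
\[
  Z(\beta) \;=\; \exp\!\left( \sum_{n=1}^{\infty} \frac{1}{n}\, Z_1(n\beta) \right).
\]
```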

Relevance:

20.00%

Publisher:

Abstract:

We consider generalized gravitational entropy in various higher-derivative theories of gravity dual to four-dimensional CFTs, using the recently proposed regularization of squashed cones. We derive the universal terms in the entanglement entropy for spherical and cylindrical surfaces. This is achieved by constructing the Fefferman-Graham expansion for the leading-order metrics of the bulk geometry and evaluating the generalized gravitational entropy. We further show that the Wald entropy evaluated in the bulk geometry constructed for the regularized squashed cones leads to the correct universal parts of the entanglement entropy for both spherical and cylindrical entangling surfaces. We comment on the relation with the Iyer-Wald formula for dynamical horizons relating entropy to a Noether charge. Finally, we show how to derive the entangling surface equation in Gauss-Bonnet holography.
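For reference, the Wald entropy mentioned above is the standard Noether-charge expression; normalization and index conventions vary between references, so the block below is background rather than the paper's specific evaluation.

```latex
% Standard Wald (Noether-charge) entropy functional:
\[
  S_{\mathrm{Wald}}
  \;=\;
  -2\pi \int_{\Sigma} d^{d-2}x \,\sqrt{h}\;
  \frac{\partial \mathcal{L}}{\partial R_{\mu\nu\rho\sigma}}\,
  \epsilon_{\mu\nu}\,\epsilon_{\rho\sigma},
\]
% where \Sigma is the horizon (or entangling-surface) cross-section, h the
% induced metric on it, and \epsilon_{\mu\nu} its binormal.
```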

Relevance:

20.00%

Publisher:

Abstract:

The recent focus of flood frequency analysis (FFA) studies has been on developing methods to model joint distributions of variables such as peak flow, volume, and duration that characterize a flood event, as comprehensive knowledge of a flood event is often necessary in hydrological applications. A diffusion-process-based adaptive kernel (D-kernel) is suggested in this paper for this purpose. It is data driven and flexible, and unlike most kernel density estimators it always yields a bona fide probability density function. It overcomes shortcomings associated with the use of conventional kernel density estimators in FFA, such as the boundary leakage problem and the normal reference rule. The potential of the D-kernel is demonstrated by application to synthetic samples of various sizes drawn from known unimodal and bimodal populations, and to five typical peak flow records from different parts of the world. It is shown to be effective when compared with the conventional Gaussian kernel and the best of seven commonly used copulas (Gumbel-Hougaard, Frank, Clayton, Joe, Normal, Plackett, and Student's t) in estimating the joint distribution of peak flow characteristics and in extrapolating beyond historical maxima. Selection of the optimum number of bins is found to be critical in modeling with the D-kernel.
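The D-kernel itself is not spelled out in the abstract. As a point of reference, the sketch below fits the conventional bivariate Gaussian kernel density estimate, i.e. the baseline the paper compares against, to synthetic peak-flow/volume pairs; the data, units, and bandwidth rule are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic flood-event sample: correlated peak flow (m^3/s) and volume (Mm^3).
rng = np.random.default_rng(0)
log_peak = rng.normal(5.0, 0.4, 500)
log_volume = 0.8 * log_peak + rng.normal(0.0, 0.2, 500)
sample = np.vstack([np.exp(log_peak), np.exp(log_volume)])  # shape (2, n)

# Conventional Gaussian KDE baseline (Scott's rule bandwidth by default).
kde = gaussian_kde(sample)

# Evaluate the joint density on a grid, e.g. for joint return-period maps.
peak_grid, vol_grid = np.meshgrid(np.linspace(50, 600, 100),
                                  np.linspace(20, 400, 100))
density = kde(np.vstack([peak_grid.ravel(), vol_grid.ravel()])).reshape(peak_grid.shape)
print("max joint density on grid:", density.max())
```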

Relevance:

20.00%

Publisher:

Abstract:

An important question in kernel regression is that of estimating the order and bandwidth parameters from available noisy data. We propose to solve the problem within a risk estimation framework. Considering an independent and identically distributed (i.i.d.) Gaussian observations model, we use Stein's unbiased risk estimator (SURE) to estimate a weighted mean-square error (MSE) risk and optimize it with respect to the order and bandwidth parameters. The two parameters are thus spatially adapted in such a manner that noise smoothing and fine-structure preservation are achieved simultaneously. On the application side, we consider the problem of image restoration from uniform/non-uniform data and show that the SURE approach to spatially adaptive kernel regression yields better-quality estimates than its spatially non-adaptive counterparts. The denoising results obtained are comparable to those of other state-of-the-art techniques, and in some scenarios superior.
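A minimal sketch of the underlying idea follows: for a linear smoother with i.i.d. Gaussian noise, SURE can be computed in closed form and minimized over the bandwidth. This is a one-dimensional Nadaraya-Watson example with a single global bandwidth, not the spatially adaptive, order-and-bandwidth-selecting scheme of the paper; the signal and noise level are made up.

```python
import numpy as np

def nw_smoother_matrix(x, h):
    """Nadaraya-Watson smoother matrix S (y_hat = S @ y) with a Gaussian kernel."""
    d = x[:, None] - x[None, :]
    W = np.exp(-0.5 * (d / h) ** 2)
    return W / W.sum(axis=1, keepdims=True)

def sure(y, S, sigma):
    """Stein's unbiased estimate of the MSE risk for the linear estimate S @ y,
    assuming i.i.d. Gaussian noise with known standard deviation sigma."""
    n = len(y)
    residual = y - S @ y
    return residual @ residual - n * sigma**2 + 2 * sigma**2 * np.trace(S)

# Noisy samples of a smooth signal.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
sigma = 0.2
y = np.sin(2 * np.pi * x) + rng.normal(0, sigma, x.size)

# Pick the bandwidth that minimizes the SURE criterion.
bandwidths = np.logspace(-2.5, -0.5, 30)
scores = [sure(y, nw_smoother_matrix(x, h), sigma) for h in bandwidths]
best_h = bandwidths[int(np.argmin(scores))]
print("SURE-selected bandwidth:", best_h)
```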

Relevance:

20.00%

Publisher:

Abstract:

The low-surface-brightness galaxies are gas rich and yet have a low star formation rate; this is a well-known puzzle. The spiral features in these galaxies are weak and difficult to trace, although this aspect has not been studied much. These galaxies are known to be dominated by the dark matter halo from the innermost regions. Here, we do a stability analysis for the galactic disc of UGC 7321, a low-surface-brightness, superthin galaxy for which the various observational input parameters are available. We show that the disc is stable against local, linear axisymmetric and non-axisymmetric perturbations. The Toomre Q parameter values are found to be large (>> 1), mainly due to the low disc surface density and the high rotation velocity resulting from the dominant dark matter halo, which could explain the observed low star formation rate. For the stars-alone case, the disc shows finite swing amplification, but the addition of the dark matter halo suppresses that amplification almost completely. Even the inclusion of the low-dispersion gas, which constitutes a high disc mass fraction, does not help in causing swing amplification. This can explain why these galaxies do not show strong spiral features. Thus, the dynamical effect of a halo that is dominant from the inner regions can naturally explain why star formation and spiral features are largely suppressed in low-surface-brightness galaxies, making them different from high-surface-brightness galaxies.
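For reference, the local stability measure invoked above is the standard Toomre criterion, written below for the stellar and gaseous components separately; the paper's multi-component treatment may differ in detail.

```latex
% Standard Toomre stability parameters for stars and gas:
\[
  Q_{\mathrm{star}} \;=\; \frac{\sigma_R\,\kappa}{3.36\,G\,\Sigma_{\mathrm{star}}},
  \qquad
  Q_{\mathrm{gas}} \;=\; \frac{c_s\,\kappa}{\pi\,G\,\Sigma_{\mathrm{gas}}},
\]
% with local axisymmetric stability for Q > 1; here \kappa is the epicyclic
% frequency, \sigma_R the stellar radial velocity dispersion, c_s the gas
% sound speed, and \Sigma the relevant disc surface density.
```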

Relevance:

20.00%

Publisher:

Abstract:

Regionalization approaches are widely used in water resources engineering to identify hydrologically homogeneous groups of watersheds, referred to as regions. Pooled information from sites (depicting watersheds) in a region forms the basis for estimating quantiles associated with hydrological extreme events at ungauged or sparsely gauged sites in the region. Conventional regionalization approaches can be effective when watersheds (data points) corresponding to different regions can be separated by straight lines or linear planes in the space of watershed-related attributes. In this paper, a kernel-based fuzzy c-means (KFCM) clustering approach is presented for use in situations where such linear separation of regions cannot be accomplished. The approach uses kernel functions to map the data points from the attribute space to a higher-dimensional space where they can be separated into regions by linear planes. A procedure to determine the optimal number of regions with the KFCM approach is suggested. Further, formulations to estimate flood quantiles at ungauged sites with the approach are developed. The effectiveness of the approach is demonstrated through Monte Carlo simulation experiments and a case study on watersheds in the United States. Comparison of results with those based on conventional fuzzy c-means clustering, the region-of-influence approach, and a prior study indicates that the KFCM approach outperforms the others in forming regions that are closer to being statistically homogeneous and in estimating flood quantiles at ungauged sites. Key points: (i) a kernel-based regionalization approach is presented for flood frequency analysis; (ii) a kernel procedure to estimate flood quantiles at ungauged sites is developed; (iii) a set of fuzzy regions is delineated in Ohio, USA.
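The sketch below implements one common kernelized variant of fuzzy c-means, with prototypes kept implicitly in the RBF-induced feature space; it is not necessarily the exact formulation used in the paper, and the watershed attributes, kernel, and parameters are fabricated for illustration.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_fuzzy_cmeans(X, n_clusters=3, m=2.0, gamma=0.5, n_iter=100, seed=0):
    """Kernel fuzzy c-means with cluster prototypes kept implicitly in the
    RBF feature space.  Returns the (n_samples, n_clusters) membership matrix."""
    rng = np.random.default_rng(seed)
    K = rbf_kernel(X, gamma=gamma)                  # (n, n) kernel matrix
    n = X.shape[0]
    U = rng.dirichlet(np.ones(n_clusters), size=n)  # fuzzy memberships, rows sum to 1

    for _ in range(n_iter):
        D2 = np.empty((n, n_clusters))
        for i in range(n_clusters):
            w = U[:, i] ** m
            a = w / w.sum()                         # normalized prototype weights
            # squared distance of every point to prototype i in feature space
            D2[:, i] = np.diag(K) - 2.0 * K @ a + a @ K @ a
        D2 = np.maximum(D2, 1e-12)
        # standard fuzzy c-means membership update with kernel-induced distances
        ratio = (D2[:, :, None] / D2[:, None, :]) ** (1.0 / (m - 1.0))
        U = 1.0 / ratio.sum(axis=2)
    return U

# Illustrative watershed attributes (e.g. area, mean rainfall, slope), standardized.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, (40, 3)) for c in (-1.0, 0.0, 1.0)])
U = kernel_fuzzy_cmeans(X, n_clusters=3)
print("hard region sizes:", np.bincount(U.argmax(axis=1)))
```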

Relevance:

20.00%

Publisher:

Abstract:

Carbonization of milk-free coconut kernel pulp is carried out at low temperatures. The carbon samples are activated using KOH, and their electrical double-layer capacitor (EDLC) properties are studied. Among the several samples prepared, the activated carbon prepared at 600 °C has a large surface area (1,200 m² g⁻¹); the surface area decreases with increasing preparation temperature. Cyclic voltammetry and galvanostatic charge-discharge studies suggest that activated carbons derived from coconut kernel pulp are appropriate materials for EDLCs in acidic, alkaline, and non-aqueous electrolytes. A specific capacitance of 173 F g⁻¹ is obtained in 1 M H2SO4 electrolyte for the activated carbon prepared at 600 °C. The supercapacitor properties of the activated carbon prepared at 600 °C are superior to those of the samples prepared at higher temperatures.
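For context, the quoted specific capacitance is the value usually extracted from a galvanostatic discharge curve; an additional cell-configuration factor is sometimes included for two-electrode measurements, and the paper's exact convention is not stated in the abstract.

```latex
% Specific capacitance from a galvanostatic discharge curve:
\[
  C_{\mathrm{sp}} \;=\; \frac{I\,\Delta t}{m\,\Delta V},
\]
% with discharge current I, discharge time \Delta t, potential window \Delta V,
% and active-material mass m.
```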

Relevance:

20.00%

Publisher:

Abstract:

We compute the instantaneous contributions to the spherical harmonic modes of gravitational waveforms from compact binary systems in general orbits up to the third post-Newtonian (3PN) order. We further extend these results to compact binaries in quasielliptical orbits using the 3PN quasi-Keplerian representation of the conserved dynamics of compact binaries in eccentric orbits. Using the multipolar post-Minkowskian formalism, starting from the different mass- and current-type multipole moments, we compute the spin-weighted spherical harmonic decomposition of the instantaneous part of the gravitational waveform. These are the terms which are functions of the retarded time and do not depend on the history of the binary evolution. Together with the hereditary part, which does depend on the binary's dynamical history, these waveforms form the basis for constructing accurate templates for the detection of gravitational wave signals from binaries moving in quasielliptical orbits.
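The mode decomposition referred to above is the standard spin-weighted spherical harmonic expansion of the two waveform polarizations, shown below for reference; sign and normalization conventions vary between references.

```latex
% Spin-weighted spherical harmonic decomposition of the waveform:
\[
  h_{+} - i\,h_{\times}
  \;=\; \sum_{\ell \ge 2}\,\sum_{m=-\ell}^{\ell} h^{\ell m}\, {}_{-2}Y_{\ell m}(\theta,\phi).
\]
```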

Relevance:

20.00%

Publisher:

Abstract:

Support vector machines (SVMs) are a popular class of supervised models in machine learning. The associated compute-intensive learning algorithm limits their use in real-time applications. This paper presents a fully scalable coprocessor architecture which can compute multiple rows of the kernel matrix in parallel. Further, we propose an extended variant of the popular decomposition technique, sequential minimal optimization, which we call the hybrid working set (HWS) algorithm, to effectively utilize the benefits of cached kernel columns together with the parallel computational power of the coprocessor. The coprocessor is implemented on the Xilinx Virtex-7 field-programmable gate array based VC707 board and achieves a speedup of up to 25x for kernel computation over single-threaded computation on an Intel Core i5. An application speedup of up to 15x over a software implementation of LIBSVM and of up to 23x over SVMLight is achieved using the HWS algorithm in unison with the coprocessor. The reduction in the number of iterations and the sensitivity of the optimization time to variation in cache size using the HWS algorithm are also shown.
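Purely as an illustration of the workload being offloaded, the sketch below computes several rows of an RBF kernel matrix at once for a requested set of indices (e.g. the current working set); the actual design is a hardware coprocessor driven by a LIBSVM-style solver, so everything here, including the function name and parameters, is a software stand-in.

```python
import numpy as np

def rbf_kernel_rows(X, row_idx, gamma=0.1):
    """Compute several rows of the RBF kernel matrix at once -- the dense,
    embarrassingly parallel workload the abstract offloads to the coprocessor."""
    Xi = X[row_idx]                                  # rows requested by the solver
    sq = ((Xi[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))
K_rows = rbf_kernel_rows(X, row_idx=[3, 17, 42])     # e.g. the current working set
print(K_rows.shape)                                  # (3, 1000)
```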

Relevance:

20.00%

Publisher:

Abstract:

Helmke et al. have recently given a formula for the number of reachable pairs of matrices over a finite field. We give a new and elementary proof of the same formula by solving the equivalent problem of determining the number of so-called zero kernel pairs over a finite field. We show that the problem is equivalent to certain other enumeration problems and outline a connection with some recent results of Guo and Yang on the natural density of rectangular unimodular matrices over F_q[x]. We also propose a new conjecture on the density of unimodular matrix polynomials. © 2016 Elsevier Inc. All rights reserved.