12 results for Unified Transform Kernel
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
We study the action of a weighted Fourier–Laplace transform on the functions in the reproducing kernel Hilbert space (RKHS) associated with a positive definite kernel on the sphere. After defining a notion of smoothness implied by the transform, we show that smoothness of the kernel implies the same smoothness for the generating elements (spherical harmonics) in the Mercer expansion of the kernel. We prove a reproducing property for the weighted Fourier–Laplace transform of the functions in the RKHS and embed the RKHS into spaces of smooth functions. Some relevant properties of the embedding are considered, including compactness and boundedness. The approach taken in the paper includes two important notions of differentiability characterized by weighted Fourier–Laplace transforms: fractional derivatives and Laplace–Beltrami derivatives.
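The multiplier setup behind such results can be sketched as follows; the notation (weights w_k, smoothness exponent s) is ours and only schematic, not the paper's exact definitions:

```latex
% Mercer expansion of a positive definite kernel on the sphere S^{d-1}:
K(x,y) \;=\; \sum_{k \ge 0} a_k \sum_{j=1}^{d_k} Y_{k,j}(x)\,\overline{Y_{k,j}(y)}, \qquad a_k \ge 0,
% and a weighted Fourier--Laplace transform acting as a multiplier on
% spherical-harmonic coefficients:
\widehat{(\mathcal{D}_w f)}_{k,j} \;=\; w_k\,\widehat{f}_{k,j}.
% Laplace--Beltrami derivatives correspond to the choice
% w_k = \bigl[k(k+d-2)\bigr]^{s/2},
% i.e. the eigenvalues of -\Delta_{S^{d-1}} raised to a fractional power.
```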
Abstract:
Despite their generality, conventional Volterra filters are inadequate for some applications, due to the huge number of parameters that may be needed for accurate modelling. When a state-space model of the target system is known, the required model complexity can be assessed by computing its Volterra kernels, which also provides valuable information for choosing an adequate alternative Volterra filter structure, if necessary, and is useful for validating parameter estimation procedures. In this letter, we derive expressions for the kernels by using the Carleman bilinearization method, for which an efficient algorithm is given. Simulation results are presented, which confirm the usefulness of the proposed approach.
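The parameter growth that motivates the letter can be seen in a minimal discrete-time second-order Volterra filter. This is an illustrative sketch with names and sizes of our choosing, not the authors' Carleman-bilinearization procedure:

```python
import numpy as np

def volterra2(u, h1, h2):
    """Output of a truncated second-order Volterra filter:
    y[n] = sum_k h1[k] u[n-k] + sum_{k1,k2} h2[k1,k2] u[n-k1] u[n-k2]."""
    M = len(h1)                      # memory length
    y = np.zeros(len(u))
    for n in range(len(u)):
        # past inputs u[n], u[n-1], ..., u[n-M+1], zero-padded before n=0
        past = np.array([u[n - k] if n - k >= 0 else 0.0 for k in range(M)])
        y[n] = h1 @ past + past @ h2 @ past
    return y

# A memory of M samples already needs M + M**2 parameters at order 2;
# at order p the count grows like M**p -- the "huge number of parameters"
# the letter refers to.
M = 8
rng = np.random.default_rng(0)
h1, h2 = rng.standard_normal(M), rng.standard_normal((M, M))
u = rng.standard_normal(100)
y = volterra2(u, h1, h2)
print(h1.size + h2.size)  # 72 parameters for M = 8
```

With h2 set to zero the filter reduces to an ordinary FIR convolution, which is a handy sanity check on any kernel-extraction procedure.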
Abstract:
Vortex-induced motion (VIM) is a highly nonlinear dynamic phenomenon. Usual spectral analysis methods, based on the Fourier transform, rely on the hypotheses of linear and stationary dynamics. A method to treat nonstationary signals that emerge from nonlinear systems is the Hilbert-Huang transform (HHT) method. The development of an analysis methodology to study the VIM of a monocolumn production, storage, and offloading system using the HHT is presented. The purpose of the present methodology is to improve the statistical analysis of VIM. The results proved comparable to those obtained from a traditional analysis (mean of the 10% highest peaks), particularly for the motions in the transverse direction, although the results for the motions in the in-line direction differed from the traditional analysis by around 25%. The results from the HHT analysis are more reliable than the traditional ones, owing to the larger number of points available to calculate the statistical characteristics. These results may be used to design risers and mooring lines, as well as to obtain VIM parameters to calibrate numerical predictions. [DOI: 10.1115/1.4003493]
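The Hilbert spectral step of the HHT can be sketched with SciPy on a mono-component signal; a full HHT would first split the measured motion into intrinsic mode functions via empirical mode decomposition (EMD), which SciPy does not provide, and the signal and rates here are our own stand-ins, not the paper's data:

```python
import numpy as np
from scipy.signal import hilbert

fs = 200.0                          # sampling rate in Hz (arbitrary)
t = np.arange(0, 10, 1 / fs)
x = np.cos(2 * np.pi * 1.5 * t)     # stand-in for one VIM motion record

z = hilbert(x)                      # analytic signal x + i*H[x]
amplitude = np.abs(z)               # instantaneous amplitude envelope
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz

# Away from the record edges the envelope of a pure cosine is ~1 and the
# instantaneous frequency is ~1.5 Hz; every sample contributes a point,
# which is why HHT statistics rest on far more points than peak counting.
print(amplitude[200:-200].mean(), inst_freq[200:-200].mean())
```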
Abstract:
In this paper, a definition of the Hilbert transform operating on Colombeau's tempered generalized functions is given. Results similar to some theorems that hold in the classical theory, or in certain subspaces of Schwartz distributions, have been obtained in this framework.
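For orientation, the classical operator being generalized satisfies the standard identities:

```latex
% Classical Hilbert transform (principal-value integral), its Fourier
% multiplier, and the inversion identity -- textbook facts, not the
% Colombeau-space definition itself:
(Hf)(x) \;=\; \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-\infty}^{\infty} \frac{f(t)}{x-t}\,dt,
\qquad
\widehat{Hf}(\xi) \;=\; -i\,\operatorname{sgn}(\xi)\,\widehat{f}(\xi),
\qquad
H(Hf) \;=\; -f .
```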
Abstract:
In the process of creation of the Unified Health System (SUS) as a universal policy seeking to ensure comprehensive care, unscheduled assistance in primary healthcare units (UBS) remains an unresolved challenge. The scope of this paper is to analyze the viewpoint of health professionals on the role of primary healthcare units in meeting this demand. It is a cross-sectional study of qualitative data obtained through questionnaires and interviews with 106 medical practitioners from 6 emergency medical services and 190 professionals from 30 units, who explained why people seek emergency care for occurrences pertaining to primary care. The content analysis technique with thematic categories was used for data analysis. Lack of resources and problems with primary health unit work processes (50.8%) were the reasons most frequently cited by emergency care physicians to explain this inadequate demand. Only 33.3% of the health unit professionals agreed that these occurrences should be attended in the primary healthcare services. The limited view of the role of health services in unscheduled care, particularly among primary care professionals, possibly leads to practices that restrict access by the population.
Abstract:
We present a method of generation of exact and explicit forms of one-sided, heavy-tailed Lévy stable probability distributions g_α(x), 0 ≤ x < ∞, 0 < α < 1. We demonstrate that knowledge of one such distribution g_α(x) suffices to obtain exactly g_{α^p}(x), p = 2, 3, .... Similarly, from known g_α(x) and g_β(x), 0 < α, β < 1, we obtain g_{αβ}(x). The method is based on the construction of an integral operator, called the Lévy transform, which implements the above operations. For rational α = l/k with l < k, we reproduce in this manner many of the recently obtained exact results for g_{l/k}(x). This approach can also be recast as an application of the Efros theorem for generalized Laplace convolutions. It relies solely on efficient definite integration. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4709443]
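For context, the one-sided stable densities are characterized by a stretched-exponential Laplace transform, and the α = 1/2 case has a classical closed form (textbook facts; the Lévy-transform operator itself is constructed in the paper):

```latex
% Laplace-transform characterization of the one-sided stable densities:
\int_0^\infty e^{-p x}\, g_\alpha(x)\, dx \;=\; e^{-p^{\alpha}}, \qquad 0 < \alpha < 1,
% with the classical closed form at alpha = 1/2:
g_{1/2}(x) \;=\; \frac{1}{2\sqrt{\pi}}\; x^{-3/2}\, e^{-1/(4x)} .
```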
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as had been studied in [R. B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour concerning the latent trait distribution. Also, they developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP. They showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and the estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3], in terms of parameter recovery, mainly under the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is considered jointly with the development of model fitting assessment tools. The results are compared with the ones obtained by Azevedo et al. The results indicate that the hierarchical approach allows MCMC algorithms to be implemented more easily, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
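Henze's stochastic representation underlying the proposed algorithm can be sketched as follows; this uses the direct parameterization with shape λ (the paper works in the centred parameterization), and the sampler details are our illustration only:

```python
import numpy as np

def rskewnorm(lam, size, rng):
    """Henze (1986): if U0, U1 are independent N(0,1) and
    delta = lam / sqrt(1 + lam**2), then
    Z = delta*|U0| + sqrt(1 - delta**2)*U1  is skew-normal SN(lam)."""
    delta = lam / np.sqrt(1.0 + lam**2)
    u0 = np.abs(rng.standard_normal(size))   # half-normal component
    u1 = rng.standard_normal(size)
    return delta * u0 + np.sqrt(1.0 - delta**2) * u1

# In an MH-within-Gibbs sampler this hierarchy lets the latent |U0| be
# sampled as an auxiliary variable, leaving a single Metropolis-Hastings
# step for the remaining parameters.
rng = np.random.default_rng(42)
lam = 4.0 / 3.0                      # gives delta = 0.8 exactly
z = rskewnorm(lam, 200_000, rng)
print(z.mean())                      # theoretical mean: delta*sqrt(2/pi)
```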
Abstract:
The method of steepest descent is used to study the integral kernel of a family of normal random matrix ensembles with eigenvalue distribution P_N(z_1, ..., z_N) = Z_N^{-1} exp(-N Σ_{i=1}^{N} V_α(z_i)) Π_{1≤i<j≤N} |z_i - z_j|^2, where V_α(z) = |z|^α, z ∈ ℂ and α ∈ (0, ∞). Asymptotic formulas with error estimates on sectors are obtained. A corollary of these expansions is a scaling limit for the n-point function in terms of the integral kernel of the classical Segal-Bargmann space. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.3688293]
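For context, the classical Segal-Bargmann (Fock) space appearing in the scaling limit has the well-known reproducing kernel (standard normalization; the paper's scaling may differ by constants):

```latex
% Segal--Bargmann space: entire functions square-integrable against the
% Gaussian weight, with exponential reproducing kernel:
\mathcal{F} \;=\; H(\mathbb{C}) \cap L^2\!\bigl(\mathbb{C},\, \pi^{-1} e^{-|z|^2}\, dA\bigr),
\qquad
K(z,w) \;=\; e^{z \bar{w}},
\qquad
\langle f,\, K(\cdot, w) \rangle \;=\; f(w)\ \ \text{for } f \in \mathcal{F}.
```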
Abstract:
Oil content and grain yield in maize are negatively correlated, and so far the development of high-oil, high-yielding hybrids has not been accomplished. A full understanding of the inheritance of kernel oil content is therefore necessary to implement a breeding program to improve both traits simultaneously. Conventional and molecular marker analyses of the design III were carried out on a reference population developed from two tropical inbred lines divergent for kernel oil content. The results showed that the additive variance was considerably larger than the dominance variance, and the heritability coefficient was very high. Sixteen QTL were mapped; they were not evenly distributed along the chromosomes and accounted for 30.91% of the genetic variance. The average level of dominance computed from both the conventional and the QTL analysis was partial dominance. The overall results indicated that the additive effects were more important than the dominance effects; the latter were not unidirectional, and thus heterosis could not be exploited in crosses. Most of the favorable alleles of the QTL were in the high-oil parental inbred, from which they could be transferred to other inbreds via marker-assisted backcross selection. Our results, coupled with reported information, indicate that the development of high-oil hybrids with acceptable yields could be accomplished by using marker-assisted selection involving oil content, grain yield and its components. Finally, to exploit the xenia effect to increase the oil content even further, these hybrids should be used in the Top Cross(TM) procedure.
Abstract:
This study aimed to evaluate the chemical interaction of collagen with some substances usually applied in dental treatments to increase the durability of adhesive restorations to dentin. Initially, the similarity between human dentin collagen and type I collagen obtained from commercial bovine membranes of the Achilles deep tendon was assessed by the Attenuated Total Reflectance technique of Fourier Transform Infrared (ATR-FTIR) spectroscopy. The effects of the application of 35% phosphoric acid, 0.1M ethylenediaminetetraacetic acid (EDTA), 2% chlorhexidine, and 6.5% proanthocyanidin solution on the microstructure of collagen and on the integrity of its triple helix were then also evaluated by ATR-FTIR. It was observed that commercial type I collagen can be used as an efficient substitute for demineralized human dentin in studies that use spectroscopic analysis. The 35% phosphoric acid significantly altered the organic content of amides, proline and hydroxyproline of type I collagen. The surface treatment with 0.1M EDTA, 2% chlorhexidine, or 6.5% proanthocyanidin did not promote deleterious structural changes to the collagen triple helix. The application of 6.5% proanthocyanidin on collagen promoted hydrogen bond formation. (c) 2012 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 2012.
Abstract:
We analyze reproducing kernel Hilbert spaces of positive definite kernels on a topological space X that is either first countable or locally compact. The results include versions of Mercer's theorem and theorems on the embedding of these spaces into spaces of continuous and square integrable functions.
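A finite-sample analogue of Mercer's theorem can be checked numerically (our illustration with a Gaussian kernel on sample points, not the paper's construction): the Gram matrix of a positive definite kernel is positive semidefinite, and its eigen-decomposition mirrors the Mercer expansion K(x,y) = Σ_n λ_n φ_n(x) φ_n(y).

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=40)
K = np.exp(-(x[:, None] - x[None, :]) ** 2)  # Gaussian (RBF) Gram matrix

lam, phi = np.linalg.eigh(K)                 # eigenvalues in ascending order
print(lam.min())                             # >= 0 up to rounding error

K_rec = (phi * lam) @ phi.T                  # discrete "Mercer" expansion
print(np.allclose(K, K_rec))                 # kernel values reconstructed
```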
Abstract:
In multi-label classification, examples can be associated with multiple labels simultaneously. The task of learning from multi-label data can be addressed by methods that transform the multi-label classification problem into several single-label classification problems. The binary relevance approach is one of these methods, where the multi-label learning task is decomposed into several independent binary classification problems, one for each label in the set of labels, and the final labels for each example are determined by aggregating the predictions from all binary classifiers. However, this approach fails to consider any dependency among the labels. Aiming to accurately predict label combinations, in this paper we propose a simple approach that enables the binary classifiers to discover existing label dependency by themselves. An experimental study using decision trees, a kernel method, and Naive Bayes as base-learning techniques shows the potential of the proposed approach to improve the multi-label classification performance.
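The plain binary relevance baseline the paper builds on can be sketched as follows; the toy data and decision-tree base learner are our choices, and the sketch omits the proposed label-dependency mechanism:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Binary relevance: one independent binary classifier per label; the
# final multi-label prediction aggregates the per-label outputs.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
Y = np.column_stack([                        # 3 binary labels (toy rules)
    (X[:, 0] > 0).astype(int),
    (X[:, 1] + X[:, 2] > 0).astype(int),
    (X[:, 0] > 0).astype(int) & (X[:, 1] > 0).astype(int),
])

classifiers = [DecisionTreeClassifier(random_state=0).fit(X, Y[:, j])
               for j in range(Y.shape[1])]   # one model per label

X_new = rng.standard_normal((10, 5))
Y_pred = np.column_stack([clf.predict(X_new) for clf in classifiers])
print(Y_pred.shape)                          # (10, 3)
```

Because each classifier sees only its own label, correlations between labels (e.g. the third toy label implying the first) are ignored, which is exactly the limitation the paper targets.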