38 results for Unified Transform Kernel
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
In 1749, Jacques de Vaucanson patented his tour pour tirer la soie, or spindle for silk reeling. In that same year he presented his invention to the Academy of Sciences in Paris, of which he was a member. Jacques de Vaucanson was born in Grenoble, France, in 1709, and died in Paris in 1782. In 1741 he had been appointed inspector of silk manufactures by Louis XV. He set about reorganizing the silk industry in France, which was in considerable difficulty at the time due to foreign competition. Given Vaucanson’s position, his invention was intended to replace the traditional Piedmontese method, and had an immediate impact upon the silk industry in France and all over Europe.
Abstract:
Research project carried out by a secondary-school student and awarded a CIRIT Prize for fostering the scientific spirit among young people in 2009. The project focuses on understanding the complexity of a 19th-century photographic studio: the Napoleón studio. To grasp everything involved in making a photograph in this studio, it begins by explaining how the various photographic techniques were developed and discovered, and then presents the state of photography in 19th-century Catalonia. The core of the project has several aspects: on the one hand, it investigates the history of the founders of one of the most important studios in 19th-century Barcelona; on the other, it presents what the rooms, the sets, the clients, the typography, the cameras... were like; and finally it puts into practice everything needed to transform a blank sheet of paper into a photograph using the methods of the period. The project can be said to unfold in three areas: the first covers the technical and historical foundations of photography, for which the sources used were essentially bibliographic; the second concerns the Napoleón photographic studio, where, in addition to bibliographic sources, the information provided by a descendant of the family was of vital importance; and finally it explains the procedures used to obtain images during the 19th century and the chemical reactions on which they are based. It also includes an experimental part that gives the project an artistic and novel character.
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
In the PhD thesis “Sound Texture Modeling” we deal with the statistical modelling of textural sounds such as water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modelling of the resulting sequence by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (the hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter), and faithfully reproduces some of the sound classes. In terms of a more general taxonomy of natural events proposed by Gaver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that do not have textural properties in their entirety, those segmentation models can help in identifying textural portions of an audio recording useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we have developed a pattern-based synthesis system that allows a database of units to be explored sonically by means of their representation in a perceptual feature space. Concatenative synthesis with “molecules” built from sparse atomic representations also allows low-level correlations in perceptual audio features to be captured, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a more high-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds. Our research is embedded within the Metaverse 1 European project (2008-2011), where our models contribute as low-level building blocks within a semi-automated soundscape generation system.
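A minimal sketch of the kind of representation involved, assuming PyWavelets and a synthetic signal standing in for a recorded texture: it decomposes a signal into a wavelet tree and reports per-band statistics, the structure on which a hidden Markov tree model would operate (the probabilistic model itself is not reproduced here).

```python
# Minimal sketch (not the thesis code): wavelet tree decomposition with
# PyWavelets plus per-level coefficient statistics, the representation a
# hidden Markov tree model would be trained on.
import numpy as np
import pywt

rng = np.random.default_rng(0)
signal = rng.normal(size=4096)                  # stand-in for a recorded texture (e.g. rain)

coeffs = pywt.wavedec(signal, "db4", level=6)   # approximation band + detail bands
for depth, band in enumerate(coeffs):
    # A full hidden Markov tree would model parent-child dependencies between
    # scales; here we only report per-scale energy as a sanity check.
    print(f"level {depth}: {band.size} coefficients, energy {np.sum(band**2):.1f}")
```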
Abstract:
Construction of a web application based on the specifications of an imaginary client. Study and use of the Rational Unified Process, currently the most common method in software construction. Design of a database and implementation of the logical model using a leading DBMS on the market such as Oracle.
Abstract:
By using suitable parameters, we present a unified approach for describing four methods for representing categorical data in a contingency table. These methods include: correspondence analysis (CA), the alternative approach using the Hellinger distance (HD), the log-ratio (LR) alternative, which is appropriate for compositional data, and the so-called non-symmetrical correspondence analysis (NSCA). We then make an appropriate comparison among these four methods, and some illustrative examples are given. Some approaches based on cumulative frequencies are also linked and studied using matrices.
Key words: Correspondence analysis, Hellinger distance, Non-symmetrical correspondence analysis, log-ratio analysis, Taguchi inertia
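As an illustration of the shared machinery, a hedged sketch (toy table and textbook formulas, not the paper's code) computing row coordinates for CA and for the Hellinger-distance variant from an SVD of suitably transformed matrices; the centring used for HD is one common choice and is an assumption here.

```python
# Illustrative sketch: CA versus Hellinger-distance (HD) row coordinates,
# both obtained by an SVD of a transformed contingency table.
import numpy as np

N = np.array([[30., 10.,  5.],
              [12., 25.,  8.],
              [ 6.,  9., 20.]])          # toy contingency table
P = N / N.sum()
r = P.sum(axis=1)                        # row masses
c = P.sum(axis=0)                        # column masses

# CA: SVD of the matrix of standardized residuals.
S_ca = np.diag(r**-0.5) @ (P - np.outer(r, c)) @ np.diag(c**-0.5)
U, s, _ = np.linalg.svd(S_ca, full_matrices=False)
rows_ca = np.diag(r**-0.5) @ U * s       # principal row coordinates

# HD: the same machinery applied to square roots of the row profiles,
# centred at their weighted mean (one common centring choice, assumed here).
Q = np.sqrt(P / r[:, None])
S_hd = np.diag(np.sqrt(r)) @ (Q - r @ Q)
U2, s2, _ = np.linalg.svd(S_hd, full_matrices=False)
rows_hd = np.diag(r**-0.5) @ U2 * s2

print(np.round(rows_ca, 3))
print(np.round(rows_hd, 3))
```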
Abstract:
Motivated by the work of Mateu, Orobitg, Pérez and Verdera, who proved inequalities of the form $T_*f\lesssim M(Tf)$ or $T_*f\lesssim M^2(Tf)$ for certain singular integral operators $T$, such as the Hilbert or the Beurling transforms, we study the possibility of establishing this type of control for the Cauchy transform along a Lipschitz graph. We show that this is not possible in general, and we give a partial positive result when the graph is substituted by a Jordan curve.
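For orientation, a schematic statement of the objects involved, using standard definitions assumed here rather than quoted from the paper:

```latex
% T_* is the maximal truncated singular integral and M the Hardy--Littlewood
% maximal operator (generic definitions; the paper works with the Cauchy
% transform along a Lipschitz graph).
\[
  T_*f(x) = \sup_{\varepsilon>0}\Big|\int_{|x-y|>\varepsilon} K(x,y)\,f(y)\,d\mu(y)\Big|,
  \qquad
  Mf(x) = \sup_{r>0}\frac{1}{\mu(B(x,r))}\int_{B(x,r)}|f|\,d\mu,
\]
\[
  \text{and the estimates under study read}\quad
  T_*f \lesssim M(Tf) \quad\text{or}\quad T_*f \lesssim M^2(Tf).
\]
```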
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex $S^D$ jointly with the normal distribution on $\mathbb{R}^{D-1}$. However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
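A minimal sketch of the ilr-plus-full-bandwidth route, assuming a Helmert-type ilr basis and SciPy's gaussian_kde (whose bandwidth matrix is proportional to the sample covariance); densities are evaluated on the ilr scale, and this is illustrative rather than the estimator studied in the paper.

```python
# Minimal sketch: KDE for compositional data via the ilr transformation and a
# normal kernel with a full bandwidth matrix.
import numpy as np
from scipy.stats import gaussian_kde

def ilr(X):
    """Isometric log-ratio transform of compositions in S^D to R^{D-1}."""
    D = X.shape[1]
    L = np.log(X)
    # Helmert-type orthonormal contrast matrix (one standard choice of basis).
    V = np.zeros((D, D - 1))
    for j in range(1, D):
        V[:j, j - 1] = 1.0 / np.sqrt(j * (j + 1))
        V[j, j - 1] = -j / np.sqrt(j * (j + 1))
    return (L - L.mean(axis=1, keepdims=True)) @ V

rng = np.random.default_rng(1)
comp = rng.dirichlet([4.0, 2.0, 3.0], size=200)   # toy 3-part compositions
Z = ilr(comp)                                     # coordinates in R^2

# gaussian_kde uses a full bandwidth matrix proportional to the sample covariance.
kde = gaussian_kde(Z.T)
print(kde(ilr(np.array([[0.4, 0.3, 0.3]])).T))    # density at one composition (ilr scale)
```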
Abstract:
A problem in the archaeometric classification of Catalan Renaissance pottery is the fact that the clay supply of the pottery workshops was centrally organized by guilds, and therefore usually all potters of a single production centre produced chemically similar ceramics. However, when analysing the glazes of the ware, a large number of inclusions is usually found in the glaze, which reveal technological differences between single workshops. These inclusions were used by the potters in order to opacify the transparent glaze and to achieve a white background for further decoration. In order to distinguish the different technological preparation procedures of the single workshops, the chemical composition of these inclusions, as well as their size in the two-dimensional cut, is recorded with a scanning electron microscope. Based on the latter, a frequency distribution of the apparent diameters is estimated for each sample and type of inclusion. Following an approach by S.D. Wicksell (1925), it is in principle possible to transform the distributions of the apparent 2D diameters back to those of the true three-dimensional bodies. The applicability of this approach and its practical problems are examined using different ways of kernel density estimation and Monte-Carlo tests of the methodology. Finally, it is tested how far the obtained frequency distributions can be used to classify the pottery.
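A hedged Monte-Carlo sketch of the forward direction of Wicksell's problem (an assumed setup, not the paper's code): it shows how true sphere diameters give rise to size-biased, smaller apparent diameters in a planar cut, which is what the back-transformation must undo.

```python
# Illustrative forward simulation of Wicksell's corpuscle problem: true 3D
# inclusion diameters mapped to apparent diameters observed in a 2D section.
import numpy as np

rng = np.random.default_rng(2)

def apparent_diameters(true_d, n, rng):
    """Sample apparent 2D profile diameters from a population of sphere diameters."""
    # A plane hits a sphere with probability proportional to its diameter
    # (size bias), so sample spheres with weights ~ true_d.
    w = true_d / true_d.sum()
    hit = rng.choice(true_d, size=n, p=w)
    # Given a hit, the cut's offset from the sphere centre is uniform on [0, D/2].
    z = rng.uniform(0.0, hit / 2.0)
    return np.sqrt(hit**2 - 4.0 * z**2)

true_d = rng.lognormal(mean=1.0, sigma=0.3, size=5000)   # toy "true" diameters
obs = apparent_diameters(true_d, 2000, rng)
print(true_d.mean(), obs.mean())   # apparent diameters come out smaller on average
```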
Abstract:
The optimization of the pilot overhead in single-user wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depend on the square root of the normalized Doppler frequency. It is also shown that the widely-used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
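Schematically, and with notation assumed here rather than taken from the paper, the reported scaling can be written as:

```latex
% With normalized Doppler frequency f_D T, the optimal pilot overhead and the
% spectral-efficiency penalty relative to genie-aided CSI both scale as its
% square root (schematic form, notation assumed),
\[
  \alpha^{*} \;\propto\; \sqrt{f_{\mathrm D}T},
  \qquad
  C_{\text{CSI}} - C_{\text{pilot}}(\alpha^{*}) \;\propto\; \sqrt{f_{\mathrm D}T},
\]
% and for N_t transmit antennas the same expressions apply with f_D T
% replaced by N_t f_D T.
```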
Abstract:
The optimization of the pilot overhead in wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depend on the square root of the normalized Doppler frequency. It is also shown that the widely-used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
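A minimal sketch of a sample-point variable-bandwidth estimator on the real line, with a nearest-neighbour bandwidth rule chosen purely for illustration (the paper's negative result concerns any data-based choice):

```python
# Minimal sketch: sample-point variable-bandwidth KDE, where each observation
# carries its own bandwidth h_i, here tied to a nearest-neighbour distance.
import numpy as np

def variable_kde(x_grid, data, k=20):
    n = data.size
    # h_i = distance from data point i to its k-th nearest neighbour (one simple rule).
    dists = np.abs(data[:, None] - data[None, :])
    h = np.sort(dists, axis=1)[:, k]
    # Gaussian kernel with location-dependent bandwidth.
    u = (x_grid[None, :] - data[:, None]) / h[:, None]
    contrib = np.exp(-0.5 * u**2) / (np.sqrt(2 * np.pi) * h[:, None])
    return contrib.mean(axis=0)

rng = np.random.default_rng(3)
sample = rng.standard_normal(500)
grid = np.linspace(-4, 4, 9)
print(variable_kde(grid, sample).round(3))
```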
Abstract:
In the fixed design regression model, additional weights are considered for the Nadaraya-Watson and Gasser-Müller kernel estimators. We study their asymptotic behavior and the relationships between new and classical estimators. For a simple family of weights, and considering the IMSE as global loss criterion, we show some possible theoretical advantages. An empirical study illustrates the performance of the weighted estimators in finite samples.
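A brief sketch of the Nadaraya-Watson estimator with optional per-observation weights, under an assumed Gaussian kernel and a toy fixed design; the specific weight family studied in the paper is not reproduced here.

```python
# Illustrative sketch: weighted Nadaraya-Watson kernel regression on a fixed design.
import numpy as np

def nadaraya_watson(x_grid, x, y, h, w=None):
    w = np.ones_like(y) if w is None else w
    u = (x_grid[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * u**2)                      # Gaussian kernel
    num = (K * (w * y)[None, :]).sum(axis=1)     # weighted kernel-smoothed responses
    den = (K * w[None, :]).sum(axis=1)           # weighted kernel mass
    return num / den

n = 100
x = np.linspace(0.0, 1.0, n)                     # fixed design
y = np.sin(2 * np.pi * x) + 0.2 * np.random.default_rng(4).standard_normal(n)
print(nadaraya_watson(np.array([0.25, 0.5, 0.75]), x, y, h=0.05).round(2))
```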
Abstract:
We present a unified geometric framework for describing both the Lagrangian and Hamiltonian formalisms of regular and non-regular time-dependent mechanical systems, which is based on the approach of Skinner and Rusk (1983). The dynamical equations of motion and their compatibility and consistency are carefully studied, making clear that all the characteristics of the Lagrangian and the Hamiltonian formalisms are recovered in this formulation. As an example, a semidiscretization of the nonlinear wave equation is studied, proving the applicability of the proposed formalism.
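For orientation, a schematic of the Skinner-Rusk construction in standard notation (assumed here, not quoted from the paper):

```latex
% Work on the Whitney sum W = TQ \oplus_Q T^*Q, with a Lagrangian L and the
% natural coupling between velocities and momenta,
\[
  H(v_q,\alpha_q) = \langle \alpha_q, v_q\rangle - L(v_q), \qquad
  i_X \Omega = \mathrm{d}H,
\]
% where \Omega is the pullback to W of the canonical symplectic form of T^*Q.
% The equation is presymplectic; the compatibility and consistency analysis of
% its solutions recovers both the Euler-Lagrange and the Hamilton equations,
% and the time-dependent case adds a time factor \mathbb{R} to W.
```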