224 results for convolution


Relevance: 10.00%

Publisher:

Abstract:

A method for simulation of acoustical bores, useful in the context of sound synthesis by physical modeling of woodwind instruments, is presented. As with previously developed methods, such as digital waveguide modeling (DWM) [Smith, Comput. Music J. 16, pp 74-91 (1992)] and the multi convolution algorithm (MCA) [Martinez et al., J. Acoust. Soc. Am. 84, pp 1620-1627 (1988)], the approach is based on a one-dimensional model of wave propagation in the bore. Both the DWM method and the MCA explicitly compute the transmission and reflection of wave variables that represent actual traveling pressure waves. The method presented in this report, the wave digital modeling (WDM) method, avoids the typical limitations associated with these methods by using a more general definition of the wave variables. An efficient and spatially modular discrete-time model is constructed from the digital representations of elemental bore units such as cylindrical sections, conical sections, and toneholes. Frequency-dependent phenomena, such as boundary losses, are approximated with digital filters. The stability of a simulation of a complete acoustic bore is investigated empirically. Results of the simulation of a full clarinet show that a very good concordance with classic transmission-line theory is obtained.
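The one-dimensional traveling-wave picture shared by DWM, MCA, and WDM can be illustrated with a toy bidirectional delay-line model of a lossless cylindrical bore. This is a sketch of the basic digital-waveguide idea only, with idealized closed/open terminations and a made-up bore length; the WDM method itself uses more general wave variables and digital filters for losses.

```python
import numpy as np

# Toy digital-waveguide bore: two delay lines carry right- and left-going
# pressure waves; an ideal closed end reflects with +1, an ideal open end
# with -1. Lossless, so traveling-wave energy is conserved exactly.
N = 50                        # bore length in samples (hypothetical)
right = np.zeros(N)           # right-going pressure wave
left = np.zeros(N)            # left-going pressure wave
right[0] = 1.0                # unit impulse injected at the closed end

open_end_arrivals = []
for _ in range(4 * N):
    out_r = right[-1]         # wave arriving at the open end
    out_l = left[0]           # wave arriving at the closed end
    # shift each line one sample and feed in the end reflections
    right = np.concatenate(([+1.0 * out_l], right[:-1]))  # closed end: +1
    left = np.concatenate((left[1:], [-1.0 * out_r]))     # open end: -1
    open_end_arrivals.append(out_r)

energy = np.sum(right**2) + np.sum(left**2)
```

The impulse reaches the open end at full amplitude after one transit and the total traveling-wave energy stays at unity, which is the kind of empirical stability check the report describes for complete bores.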

Relevance: 10.00%

Publisher:

Abstract:

It is shown how the fractional probability density diffusion equation for the diffusion limit of one-dimensional continuous time random walks may be derived from a generalized Markovian Chapman-Kolmogorov equation. The non-Markovian behaviour is incorporated into the Markovian Chapman-Kolmogorov equation by postulating a Lévy-like distribution of waiting times as a kernel. The Chapman-Kolmogorov equation so generalised then takes on the form of a convolution integral. The dependence on the initial conditions typical of a non-Markovian process is treated by adding a time-dependent term involving the survival probability to the convolution integral. In the diffusion limit these two assumptions about the past history of the process are sufficient to reproduce anomalous diffusion and relaxation behaviour of the Cole-Cole type. The Green function in the diffusion limit is calculated using the fact that the characteristic function is the Mittag-Leffler function. Fourier inversion of the characteristic function yields the Green function in terms of a Wright function. The moments of the distribution function are evaluated from the Mittag-Leffler function using the properties of characteristic functions, and a relation between the powers of the second moment and higher-order even moments is derived.
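The Mittag-Leffler function at the heart of this construction can be evaluated from its defining series. This is a numerical sketch only: the truncation length is adequate for moderate arguments, and a production evaluator for large |z| would need a more careful algorithm.

```python
from math import gamma, exp, isclose

def mittag_leffler(z, alpha, terms=80):
    """One-parameter Mittag-Leffler function by truncated series:
    E_alpha(z) = sum_n z**n / Gamma(alpha*n + 1). Moderate |z| only."""
    return sum(z**n / gamma(alpha * n + 1) for n in range(terms))

# Sanity checks: E_1(z) = exp(z), and E_alpha(0) = 1 for any alpha.
# Schematically, the abstract's characteristic function takes the form
# E_alpha(-k**2 * t**alpha), reducing to the Gaussian case at alpha = 1.
value = mittag_leffler(-0.5, 1.0)
```

At alpha = 1 the series collapses to the ordinary exponential, which is the Markovian (normal diffusion) limit of the Cole-Cole behaviour described above.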

Relevance: 10.00%

Publisher:

Abstract:

In dielectronic recombination of hydrogenlike ions an intermediate doubly excited heliumlike ion is formed. Since the K shell is empty, both excited electrons can decay sequentially to the ground state. In this paper we analyze the x-ray radiation emitted from doubly and singly excited heliumlike titanium ions produced inside the Tokyo electron beam ion trap. Theoretical population densities of the singly excited states after the first transition and the transition probabilities of these states into the ground state were also calculated. This allowed theoretical branching ratios to be determined for each manifold. These branching ratios are compared to the experimentally obtained x-ray distribution by fitting across the relevant peak using a convolution of the theoretically obtained resonance strengths and energies. When 2E1 transitions, which are not observed in the experiment, are taken into account, the measured and calculated ratios agree well. This method provides valuable insight into the transition dynamics of excited highly charged ions.
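The fitting idea, comparing a measured x-ray distribution against theory by convolving discrete resonance strengths with the detector response, can be sketched as follows. The energies, strengths, and resolution below are made-up illustration values, not the paper's data.

```python
import numpy as np

# Hypothetical resonances: line positions (eV) and relative strengths.
energies = np.array([4700.0, 4720.0, 4750.0])
strengths = np.array([1.0, 0.4, 0.7])
sigma = 5.0                               # assumed Gaussian detector resolution (eV)

E = np.linspace(4650.0, 4800.0, 1501)
spectrum = np.zeros_like(E)
for e0, s in zip(energies, strengths):    # delta comb convolved with response
    spectrum += s * np.exp(-0.5 * ((E - e0) / sigma) ** 2)
spectrum /= sigma * np.sqrt(2.0 * np.pi)  # unit-area response per line

# The area under the blended peaks recovers the summed strengths, which is
# what branching-ratio fits across overlapping manifolds rely on.
total_area = np.sum(spectrum) * (E[1] - E[0])
```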

Relevance: 10.00%

Publisher:

Abstract:

Methods of measuring the acoustic behavior of tubular systems can be broadly characterized as steady state measurements, where the measured signals are analyzed in terms of infinite duration sinusoids, and reflectometry measurements which exploit causality to separate the forward and backward going waves in a duct. This paper sets out a multiple microphone reflectometry technique which performs wave separation by using time domain convolution to track the forward and backward going waves in a cylindrical source tube. The current work uses two calibration runs (one for forward going waves and one for backward going waves) to measure the time domain transfer functions for each pair of microphones. These time domain transfer functions encode the time delay, frequency dependent losses and microphone gain ratios for travel between microphones. This approach is applied to the measurement of wave separation, bore profile and input impedance. The work differs from existing frequency domain methods in that it combines the information of multiple microphones within a time domain algorithm, and differs from existing time domain methods in its inclusion of the effect of losses and gain ratios in intermicrophone transfer functions.
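The wave-separation idea can be sketched in the simplest possible setting: two microphones in a lossless duct whose inter-microphone transfer function is a pure d-sample delay. The paper's measured time-domain transfer functions additionally encode losses and gain ratios; those are omitted here, and all signal values are synthetic.

```python
import numpy as np

# Mic 1 and mic 2 are d samples apart; each records the sum of the forward
# (toward mic 2) and backward traveling waves. With f, b the waves at mic 1:
#   p1[t] = f[t] + b[t]
#   p2[t] = f[t - d] + b[t + d]
# Tracking the waves in the time domain gives the recursion below.
rng = np.random.default_rng(0)
n, d = 128, 8
f = rng.standard_normal(n)                  # forward wave at mic 1
b = np.zeros(n)
b[2 * d:] = rng.standard_normal(n - 2 * d)  # reflections arrive causally late

def at(x, k):                               # zero outside the recorded window
    return x[k] if 0 <= k < n else 0.0

p1 = np.array([at(f, t) + at(b, t) for t in range(n)])
p2 = np.array([at(f, t - d) + at(b, t + d) for t in range(n)])

f_hat = np.zeros(n)
b_hat = np.zeros(n)
for t in range(n):
    # p2[t - d] = f[t - 2d] + b[t]  =>  solve for b[t], then f[t] from p1
    b_hat[t] = at(p2, t - d) - at(f_hat, t - 2 * d)
    f_hat[t] = p1[t] - b_hat[t]
```

In the paper's setting the pure delay is replaced by measured time-domain transfer functions obtained from the two calibration runs, so the same tracking recursion becomes a convolution that also accounts for losses and microphone gain ratios.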

Relevance: 10.00%

Publisher:

Abstract:

It is shown that, when expressing arguments in terms of their logarithms, the Laplace transform of a function is related to the antiderivative of this function by a simple convolution. This allows efficient numerical computation of moment generating functions of positive random variables and their inversion. The application of the method is straightforward, apart from the necessity to implement it using high-precision arithmetic. In numerical examples the approach is demonstrated to be particularly useful for distributions with heavy tails, such as lognormal, Weibull, or Pareto distributions, which are otherwise difficult to handle. The computational efficiency compared to other methods is demonstrated for an M/G/1 queueing problem.
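The stated relation can be checked numerically in a simple case. Integrating by parts gives F(s) = s ∫ g(x) e^(-sx) dx with g the antiderivative of f, and substituting x = e^u, s = e^(-v) turns this into a convolution of g(e^u) with a fixed Gumbel-shaped kernel. This derivation is my reading of the identity, and the test case f(x) = e^(-x), F(s) = 1/(1+s) is purely illustrative.

```python
import numpy as np

# On the log axis:  F(e^{-v}) = Int g(e^u) k(u - v) du,  k(w) = e^w * exp(-e^w).
u = np.linspace(-30.0, 30.0, 20001)
du = u[1] - u[0]
g = 1.0 - np.exp(-np.exp(u))        # antiderivative of f(x) = e^{-x} at x = e^u

def laplace_via_log_convolution(s):
    w = u + np.log(s)               # w = u - v with v = -log(s)
    kernel = np.exp(w - np.exp(w))  # Gumbel-shaped kernel e^w * exp(-e^w)
    return float(np.sum(g * kernel) * du)
```

Because the integrand decays exponentially at both ends of the logarithmic grid, the simple Riemann sum is already very accurate; the high-precision arithmetic the abstract mentions matters for the inversion step rather than for this forward check.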

Relevance: 10.00%

Publisher:

Abstract:

Orthogonal frequency division multiplexing (OFDM) requires an expensive linear amplifier at the transmitter due to its high peak-to-average power ratio (PAPR). Single carrier with cyclic prefix (SC-CP) is a closely related transmission scheme that possesses most of the benefits of OFDM but does not have the PAPR problem. Although SC-CP is very robust to frequency-selective fading in a multipath environment, it is sensitive to the time-selective fading characteristics of the wireless channel, which disturb the orthogonality of the channel matrix (CM) and increase the computational complexity of the receiver. In this paper, we propose a low-complexity iterative time-domain algorithm that compensates for the effects of time selectivity of the channel by exploiting the sparsity present in the channel convolution matrix. Simulation results show the superior performance of the proposed algorithm over the standard linear minimum mean-square error (L-MMSE) equalizer for SC-CP.
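The structure SC-CP relies on, and which time-selective fading destroys, can be sketched for a static channel: the cyclic prefix turns the channel's linear convolution into a circular one, so the channel matrix is diagonalized by the DFT and a one-tap frequency-domain equalizer suffices. The BPSK block and random channel below are illustrative, and no noise is added.

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 64, 4
x = rng.choice([-1.0, 1.0], size=N)        # one SC-CP block of BPSK symbols
h = rng.standard_normal(L) / np.sqrt(L)    # static multipath channel taps

tx = np.concatenate([x[-L:], x])           # prepend the cyclic prefix
rx = np.convolve(tx, h)[L:L + N]           # channel convolution, CP stripped

# With the CP stripped, rx is the circular convolution of x and h, so the
# channel is a single complex gain per DFT bin:
H = np.fft.fft(h, N)
x_hat = np.real(np.fft.ifft(np.fft.fft(rx) / H))
```

A time-varying channel makes the effective matrix non-circulant, which is exactly why the paper's receiver needs an iterative equalizer that exploits the sparsity of the channel convolution matrix instead of this one-tap shortcut.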

Relevance: 10.00%

Publisher:

Abstract:

This paper describes the design, application, and evaluation of a user-friendly, flexible, scalable, and inexpensive Advanced Educational Parallel (AdEPar) digital signal processing (DSP) system based on TMS320C25 digital processors to implement DSP algorithms. This system will be used in the DSP laboratory by graduate students to work on advanced topics such as developing parallel DSP algorithms. Graduating senior students who have gained some experience in DSP can also use the system. The DSP laboratory has proved to be a useful tool in the hands of the instructor to teach the mathematically oriented topics of DSP that are often difficult for students to grasp. The DSP laboratory with assigned projects has greatly improved the ability of the students to understand such complex topics as the fast Fourier transform algorithm, linear and circular convolution, and the theory and design of infinite impulse response (IIR) and finite impulse response (FIR) filters. The user-friendly PC software support of the AdEPar system makes it easy for students to develop DSP programs. This paper gives the architecture of the AdEPar DSP system. The communication between processors and the PC-DSP processor communication are explained. The parallel debugger kernels and the restrictions of the system are described. Programming in AdEPar is explained, and two benchmarks (parallel FFT and DES) are presented to show the system performance.
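One of the lab topics mentioned, the difference between linear and circular convolution, can be demonstrated in a few lines. This is a generic classroom example in Python, not AdEPar code or TMS320C25 assembly.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
h = np.array([1.0, -1.0, 2.0])

linear = np.convolve(x, h)          # length 3 + 3 - 1 = 5: [1, 1, 3, 1, 6]

def circular_convolve(a, b, N):
    # N-point circular convolution via the DFT (multiplication in frequency)
    return np.real(np.fft.ifft(np.fft.fft(a, N) * np.fft.fft(b, N)))

circ3 = circular_convolve(x, h, 3)  # N too small: the tail wraps around
circ5 = circular_convolve(x, h, 5)  # N >= 5: identical to linear convolution
```

The 3-point result is the linear result time-aliased modulo 3, which is the standard pitfall when FFT sizes are chosen too small for fast convolution.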

Relevance: 10.00%

Publisher:

Abstract:

The prediction and management of ecosystem responses to global environmental change would profit from a clearer understanding of the mechanisms determining the structure and dynamics of ecological communities. The analytic theory presented here develops a causally closed picture for the mechanisms controlling community and population size structure, in particular community size spectra, and their dynamic responses to perturbations, with emphasis on marine ecosystems. Important implications are summarised in non-technical form. These include the identification of three different responses of community size spectra to size-specific pressures (of which one is the classical trophic cascade), an explanation for the observed slow recovery of fish communities from exploitation, and clarification of the mechanism controlling predation mortality rates. The theory builds on a community model that describes trophic interactions among size-structured populations and explicitly represents the full life cycles of species. An approximate time-dependent analytic solution of the model is obtained by coarse graining over maturation body sizes to obtain a simple description of the model steady state, linearising near the steady state, and then eliminating intraspecific size structure by means of the quasi-neutral approximation. The result is a convolution equation for trophic interactions among species of different maturation body sizes, which is solved analytically using a novel technique based on a multiscale expansion.

Relevance: 10.00%

Publisher:

Abstract:

A systolic array is an array of individual processing cells, each of which has some local memory and is connected only to its nearest neighbours in the form of a regular lattice. On each cycle of a simple clock every cell receives data from its neighbouring cells and performs a specific processing operation on it. The resulting data is stored within the cell and passed on to neighbouring cells on the next clock cycle. This paper gives an overview of work to date and illustrates the application of bit-level systolic arrays by means of two examples: (1) a pipelined bit-slice circuit for computing matrix-vector transforms; and (2) a bit-serial structure for multibit convolution.
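The convolution structure in example (2) can be modelled in software, at word level rather than bit level for readability. This is a behavioural sketch of the classic systolic design in which samples and partial sums march through the array in opposite directions, one cell per clock, so cells only ever talk to nearest neighbours; it illustrates the principle, not the paper's bit-level circuit.

```python
def systolic_convolve(samples, weights):
    K = len(weights)
    x_pipe = [0.0] * K                       # samples flowing right
    y_pipe = [0.0] * K                       # partial sums flowing left
    out = []
    # Zeros are interleaved so the two counter-moving streams stay aligned;
    # the classic price is that the array runs at half throughput.
    stream = [v for s in samples for v in (s, 0.0)] + [0.0] * (2 * K)
    for s in stream:                         # one iteration = one clock cycle
        out.append(y_pipe[0])                # finished sum leaves the array
        y_pipe = y_pipe[1:] + [0.0]          # partial sums shift left
        x_pipe = [s] + x_pipe[:-1]           # samples shift right
        y_pipe = [y + w * x                  # every cell does one MAC
                  for y, w, x in zip(y_pipe, weights, x_pipe)]
    return out[1::2][:len(samples) + K - 1]  # valid outputs on odd clocks
```

Each cell performs exactly one multiply-accumulate per clock using only its own weight and the values handed to it by its neighbours, which is what makes the layout so regular and amenable to VLSI implementation.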

Relevance: 10.00%

Publisher:

Abstract:

Gene flow in macroalgal populations can be strongly influenced by spore or gamete dispersal. This, in turn, is influenced by a convolution of the effects of current flow and specific plant reproductive strategies. Although several studies have demonstrated genetic variability in macroalgal populations over a wide range of spatial scales, the associated current data have generally been poorly resolved spatially and temporally. In this study, we used a combination of population genetic analyses and high-resolution hydrodynamic modelling to investigate potential connectivity between populations of the kelp Laminaria digitata in the Strangford Narrows, a narrow channel characterized by strong currents linking the large semi-enclosed sea lough, Strangford Lough, to the Irish Sea. Levels of genetic structuring based on six microsatellite markers were very low, indicating high levels of gene flow and a pattern of isolation-by-distance, where populations are more likely to exchange migrants with geographically proximal populations, but with occasional long-distance dispersal. This was confirmed by the particle tracking model, which showed that, while the majority of spores settle near the release site, there is potential for dispersal over several kilometres. This combined population genetic and modelling approach suggests that the complex hydrodynamic environment at the entrance to Strangford Lough can facilitate dispersal on a scale exceeding that proposed for L. digitata in particular, and the majority of macroalgae in general. The study demonstrates the potential of integrated physical–biological approaches for the prediction of ecological changes resulting from factors such as anthropogenically induced coastal zone changes.

Relevance: 10.00%

Publisher:

Abstract:

The production of color and flavor compounds in wine is the result of different interrelated reaction mechanisms. Among these, oxidation and the Maillard reaction stand out as particularly relevant due to their large impact on the sensory quality of wines and consequently on product shelf life. The aim of this thesis is to achieve a global vision of wine degradation mechanisms. The identification of the mediating reactions involved in oxidative browning and aromatic degradation is attempted using different detectors. Two approaches are implemented in this work: a "non-target" approach, in which relevant analytical tools are used to merge the information from cyclic voltammetry and diode-array (DAD) detectors, allowing a broader overview of the system and the flagging of compounds of interest; and a "target" approach, in which the identification and quantification of the different compounds related to the wine degradation process are performed using different detectors (HPLC-UV/Vis, LC-MS, GC-MS, and FID). Two different degradation schemes are used in this study: wines subjected to O2 and temperature perturbations, and synthetic solutions with relevant wine constituents for mechanism validation. Results clearly demonstrate a "convolution" of chemical mechanisms: the presence of oxygen combined with temperature had a synergistic effect on the formation of several key odorant compounds. The results of this work could be translated to wine-making and wine-storage practice through the modelling of the analysed compounds.

Relevance: 10.00%

Publisher:

Abstract:

Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is part of the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may cure cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during treatment of the other breast, and is therefore at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformity in the target volumes in breast radiotherapy? What happens to the target volumes and the surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)?

The answers to these questions largely rely on Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and the material composition of the organs and tissues involved. This work investigates the impact of treating left breast cancer using different RT techniques, namely f-IMRT (forwardly planned intensity-modulated RT), inversely planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams), and dynamic conformal arc RT (DCART), and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo algorithm (iMC). Furthermore, an accurate MC model of the linear accelerator used (a Trilogy, VARIAN Medical Systems) was built with the EGSnrc MC code to determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never been simulated before; the model developed is now included in the EGSnrc MC package of the National Research Council Canada (NRC). The simulated linac was benchmarked against water measurements and later validated against TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20 software.

For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR, and in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue (right breast, right lung, heart, and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART presented no advantages over any of the techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung, and heart than the MC algorithms predicted, while the MC algorithms agreed with each other to within 2%. The PBC algorithm was considered inaccurate in determining the dose in heterogeneous media and in build-up regions; a major effort is therefore under way at the clinic to acquire the data needed to move from PBC to another calculation algorithm. Despite better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when non-tangential techniques are used. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT TPS.
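The comparison tool used throughout the thesis, the cumulative dose-volume histogram, reduces a 3-D dose grid to the fraction of a structure receiving at least each dose level. A minimal sketch with synthetic voxel doses follows; the numbers are toy values, not thesis data.

```python
import numpy as np

rng = np.random.default_rng(7)
ptv_doses = rng.normal(50.0, 2.0, size=10_000)   # toy PTV voxel doses (Gy)

levels = np.linspace(0.0, 60.0, 121)             # dose axis of the DVH
dvh = np.array([(ptv_doses >= d).mean() for d in levels])

# Plan-quality metrics read straight off the curve, e.g. the coverage
# fraction receiving at least 95% of the 50 Gy prescription:
v95 = (ptv_doses >= 0.95 * 50.0).mean()
```

Curves like this one, computed per structure (PTV, lungs, heart, CLB), are what the dose-volume-histogram comparisons and the SPSS statistics in the abstract operate on.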

Relevance: 10.00%

Publisher:

Abstract:

Digital Terrain Models (DTMs) are important in geology and geomorphology, since elevation data contain a great deal of information about the geomorphological processes that shape topography. The first derivative of topography is attitude; the second is curvature. GIS tools were developed for deriving strike, dip, curvature, and curvature orientation from Digital Elevation Models (DEMs). A method for displaying both strike and dip simultaneously as a colour-coded visualization (AVA) was implemented. A plug-in for calculating strike and dip via least-squares regression was first created using VB.NET. Further work produced a more computationally efficient solution, convolution filtering, which was implemented as Python scripts; these scripts were also used for the calculation of curvature and curvature orientation. The application of these tools was demonstrated by performing morphometric studies on datasets from Earth and Mars. The tools show promise; however, more work is needed to explore their full potential and possible uses.
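The convolution-filtering approach to attitude can be sketched as follows: 3x3 finite-difference kernels estimate the surface gradient, from which dip angle and dip direction follow. The DEM is a synthetic plane, and the kernel choice, cell size, and azimuth convention are assumptions for illustration, not the thesis' exact filters.

```python
import numpy as np

def conv3x3(z, k):
    """Convolve a grid with a 3x3 kernel (interior cells only, no padding)."""
    out = np.zeros_like(z)
    for i in range(1, z.shape[0] - 1):
        for j in range(1, z.shape[1] - 1):
            out[i, j] = np.sum(z[i - 1:i + 2, j - 1:j + 2] * k)
    return out

cell = 10.0                                            # grid spacing (m), assumed
kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]) / (6.0 * cell)
ky = np.array([[1, 1, 1], [0, 0, 0], [-1, -1, -1]]) / (6.0 * cell)

# Synthetic planar DEM dipping 45 degrees toward +x: z = x
x = np.arange(0.0, 200.0, cell)
z = np.tile(x, (20, 1))

dzdx = conv3x3(z, kx)
dzdy = conv3x3(z, ky)
dip = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))      # dip angle in degrees
dip_dir = np.degrees(np.arctan2(dzdx, dzdy))           # azimuth convention assumed
```

Because the kernels are small and separable from the grid traversal, this formulation maps directly onto raster GIS filter tools, which is the computational advantage over per-window least-squares fitting noted in the abstract.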

Relevance: 10.00%

Publisher:

Abstract:

Completed under joint supervision (cotutelle) with Université Bordeaux 1 (France)

Relevance: 10.00%

Publisher:

Abstract:

Financial assets are often modeled by stochastic differential equations (SDEs). These equations can describe the behavior of the asset and, in some cases, of certain model parameters as well. For example, the Heston (1993) model, which belongs to the class of stochastic volatility models, describes the behavior of the asset and of its variance. The Heston model is very attractive because it admits semi-analytical formulas for certain derivatives, as well as a degree of realism. However, most simulation algorithms for this model run into problems when the Feller (1951) condition is not satisfied. In this thesis, we introduce three new simulation algorithms for the Heston model. These algorithms aim to accelerate the well-known algorithm of Broadie and Kaya (2006); to do so, we use, among other tools, Markov chain Monte Carlo (MCMC) methods and approximations. In the first algorithm, we modify the second step of the Broadie-Kaya method in order to speed it up: instead of using second-order Newton's method and the inversion approach, we use the Metropolis-Hastings algorithm (see Hastings (1970)). The second algorithm is an improvement of the first: instead of using the true density of the integrated variance, we use the approximation of Smith (2007). This improvement reduces the dimension of the characteristic equation and speeds up the algorithm. Our last algorithm is not based on an MCMC method, but it still aims to accelerate the second step of the Broadie and Kaya (2006) method. To achieve this, we use a gamma random variable whose moments are matched to those of the true time-integrated variance. According to Stewart et al. (2007), a convolution of gamma random variables (which closely resembles the representation given by Glasserman and Kim (2008) when the time step is small) can be approximated by a single gamma random variable.
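The moment-matching step in the third algorithm can be sketched generically: a gamma variable is fit to the mean and variance of a positive random variable. The target moments below are made-up stand-ins for the integrated variance, not values derived from the Heston model.

```python
import numpy as np

def matched_gamma_params(mean, var):
    """Gamma(shape k, scale theta) with k*theta = mean and k*theta**2 = var."""
    return mean**2 / var, var / mean

rng = np.random.default_rng(42)
target_mean, target_var = 0.04, 0.0008         # hypothetical target moments
shape, scale = matched_gamma_params(target_mean, target_var)
draws = rng.gamma(shape, scale, size=200_000)  # stand-in for integrated variance
```

Sampling a single gamma variable per path replaces the expensive characteristic-function inversion of the Broadie-Kaya second step, which is the source of the speedup claimed for this approach.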