940 results for Unified transform
Abstract:
In this paper we investigate various algorithms for performing the Fast Fourier Transform (FFT)/Inverse Fast Fourier Transform (IFFT), and proper techniques for maximizing the FFT/IFFT execution speed, such as pipelining or parallel processing, and the use of memory structures with pre-computed values (look-up tables, LUT) or other dedicated hardware components (usually multipliers). Furthermore, we discuss the optimal hardware architectures that best apply to various FFT/IFFT algorithms, along with their abilities to exploit parallel processing with minimal data dependencies in the FFT/IFFT calculations. An interesting approach that is also considered in this paper is the application of the integrated processing-in-memory Intelligent RAM (IRAM) chip to high-speed FFT/IFFT computing. The results of the assessment study emphasize that the execution speed of the FFT/IFFT algorithms is tightly connected to the capabilities of the FFT/IFFT hardware to support the parallelism provided by the given algorithm. Therefore, we suggest that the basic Discrete Fourier Transform (DFT)/Inverse Discrete Fourier Transform (IDFT) can also provide high performance by utilizing a specialized FFT/IFFT hardware architecture that can exploit the parallelism provided by the DFT/IDFT operations. The proposed improvements include simplified multiplications over symbols given in a polar coordinate system, using sine and cosine look-up tables, and an approach for performing parallel addition of N input symbols.
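To illustrate the kind of simplification referred to in the last sentence, the sketch below (Python, written for this listing and not taken from the paper; the transform length N and the function name are assumptions) shows how multiplying a symbol held in polar form by a twiddle factor reduces to an index addition modulo N followed by two reads from pre-computed sine and cosine tables.

    import math

    N = 64  # transform length, chosen only for illustration
    # Pre-computed look-up tables, one entry per possible twiddle angle 2*pi*k/N.
    COS_LUT = [math.cos(2 * math.pi * k / N) for k in range(N)]
    SIN_LUT = [math.sin(2 * math.pi * k / N) for k in range(N)]

    def twiddle_multiply_polar(magnitude, phase_index, k):
        """Multiply the symbol magnitude*exp(j*2*pi*phase_index/N) by the
        twiddle factor exp(-j*2*pi*k/N).  In polar form the product only
        requires adding the angle indices modulo N; the Cartesian result is
        then read from the sine/cosine tables instead of being computed."""
        idx = (phase_index - k) % N
        return magnitude * COS_LUT[idx], magnitude * SIN_LUT[idx]

    # Example: rotate a unit-magnitude symbol at angle index 5 by the k = 3 twiddle.
    print(twiddle_multiply_polar(1.0, 5, 3))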
Abstract:
Rotation distance quantifies the difference in shape between two rooted binary trees of the same size by counting the minimum number of elementary changes needed to transform one tree to the other. We describe several types of rotation distance, and provide upper bounds on distances between trees with a fixed number of nodes with respect to each type. These bounds are obtained by relating each restricted rotation distance to the word length of elements of Thompson's group F with respect to different generating sets, including both finite and infinite generating sets.
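Restated in notation (symbols chosen here for convenience, not taken from the paper), the rotation distance counted in the abstract is

    d(T_1, T_2) = \min\{\, k \ge 0 : T_1 = S_0 \to S_1 \to \cdots \to S_k = T_2,\ \text{each } S_{i+1} \text{ obtained from } S_i \text{ by a single rotation} \,\},

and each restricted variant additionally constrains which rotations may be used at every step.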
Abstract:
The goal of this paper is to develop a model of financial intermediation to analyze the impact of various forms of taxation. The model considers in a unified framework various functions of banks: monitoring, transaction services and asset transformation. Particular attention is devoted to the conditions for separability between deposits and loans. The analysis focuses on: (i) competition between banks and alternative financial arrangements (investment funds and organized security markets), (ii) regulation, and (iii) banks' monopoly power and risk-taking behavior.
Abstract:
Consider a Riemannian manifold equipped with an infinitesimal isometry. For this setup, a unified treatment is provided, solely in the language of Riemannian geometry, of techniques in reduction, linearization, and stability of relative equilibria. In particular, for mechanical control systems, an explicit characterization is given for the manner in which reduction by an infinitesimal isometry, and linearization along a controlled trajectory "commute." As part of the development, relationships are derived between the Jacobi equation of geodesic variation and concepts from reduction theory, such as the curvature of the mechanical connection and the effective potential. As an application of our techniques, fiber and base stability of relative equilibria are studied. The paper also serves as a tutorial of Riemannian geometric methods applicable in the intersection of mechanics and control theory.
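For reference, the Jacobi equation of geodesic variation mentioned above has the standard form (in one common sign convention, with \nabla the Levi-Civita connection, R the curvature tensor, \gamma a geodesic and J the variation field along it):

    \nabla_{\dot\gamma}\nabla_{\dot\gamma} J + R(J, \dot\gamma)\,\dot\gamma = 0.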
Abstract:
In this paper the electoral consequences of the Islamist terrorist attacks of March 11, 2004 are analysed. Through a quantitative analysis based on a post-electoral survey, we show the causal mechanisms that transformed voters' reactions to the bombings into a particular electoral behaviour, and we estimate their relevance for the electoral results of March 14, 2004.
Abstract:
The "Phoronomia", the first book on mechanics written after the "Principia", is representative of the transition process that transformed dynamics at the beginning of the eighteenth century and that concluded with Euler's "Mecánica" (1736). It is written in a geometric and algebraic style, and mixes the concepts and methods of Leibniz and Newton in an idiosyncratic way. In this work Newton's second law appears for the first time written in the form in which we know it today, together with an attempt to construct the statics and dynamics of solids and fluids on the basis of general differential rules.
Abstract:
Macroeconomic activity has become less volatile over the past three decades in most G7 economies. The current literature focuses on characterizing the volatility reduction and on explanations for this so-called "moderation" in each G7 economy separately. In contrast to individual-country and individual-variable analyses, this paper focuses on common characteristics of the reduction and common explanations for the moderation across the G7 countries. In particular, we study three explanations: structural changes in the economy, changes in common international shocks and changes in domestic shocks. We study these explanations in a unified model structure. To this end, we propose a Bayesian factor structural vector autoregressive model. Using the proposed model, we investigate whether we can find common explanations for all G7 economies when information is pooled from multiple domestic and international sources. Our empirical analysis suggests that the volatility reductions can largely be attributed to a decline in the magnitude of the shocks in most G7 countries, while only for the U.K., the U.S. and Italy can they partially be attributed to structural changes in the economy. Analyzing the components of the volatility, we also find that domestic shocks, rather than common international shocks, account for a large part of the volatility reduction in most of the G7 countries. Finally, we find that after the mid-1980s the structure of the economy changed substantially in five of the G7 countries: Germany, Italy, Japan, the U.K. and the U.S.
Abstract:
JPEG 2000 is an image compression standard that uses state-of-the-art techniques based on the wavelet transform. Its main advantages are better compression, the possibility of operating on compressed data, and the fact that it can compress both lossily and losslessly with the same method. BOI is the JPEG 2000 implementation of the Grup de Compressió Interactiva d'Imatges of the Departament d'Enginyeria de la Informació i les Comunicacions, designed to understand, critique and improve the technologies of JPEG 2000. The new version aims to cover all the corners of the standard that the previous version did not reach.
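The "lossy and lossless with the same method" property rests on reversible integer lifting steps such as the LeGall 5/3 transform used by JPEG 2000; the 1-D sketch below (Python, illustrative only and not part of BOI; boundary handling is a simplified mirroring rather than the standard's full symmetric extension) shows one decomposition level.

    def dwt53_forward(x):
        """One level of the reversible LeGall 5/3 lifting transform (1-D sketch).
        Returns the low-pass (approximation) and high-pass (detail) coefficients;
        because only integer additions and floor divisions are used, the original
        samples can be recovered exactly by reversing the two steps."""
        n = len(x)
        d = []  # high-pass: predict odd samples from their even neighbours
        for i in range(1, n, 2):
            left = x[i - 1]
            right = x[i + 1] if i + 1 < n else x[i - 1]        # mirror at the edge
            d.append(x[i] - (left + right) // 2)
        s = []  # low-pass: update even samples with neighbouring details
        for k, i in enumerate(range(0, n, 2)):
            dl = d[k - 1] if k > 0 else (d[0] if d else 0)     # mirror at the edge
            dr = d[k] if k < len(d) else (d[-1] if d else 0)
            s.append(x[i] + (dl + dr + 2) // 4)
        return s, d

    # Example: a short signal with one sharp feature at position 3.
    print(dwt53_forward([10, 12, 14, 200, 20, 22, 24, 26]))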
Abstract:
Within the last few years, several reports have revealed that cell transplantation can be an effective way to replace lost neurons in the central nervous system (CNS) of patients affected by neurodegenerative diseases. Concerning the retina, the concept that newborn photoreceptors can integrate into the retina and restore some visual function was recently demonstrated unequivocally in the mouse eye (MacLaren et al. 2006) and remains to be achieved in humans. These results pave the way to a standard approach in regenerative medicine aimed at replacing lost photoreceptors. The discovery of stem cells has raised great hopes of elaborating protocols to generate adequate cells to restore visual function in different retinal degeneration processes. Retinal stem cells (RSCs) are good candidates for repairing the retina and are present throughout retinal development and into adulthood. However, neonatal mouse RSCs derived from the radial glia population have a different potential to proliferate and differentiate than adult RSCs. Moreover, we observed that adult mouse RSCs, depending on the culture conditions, have a marked tendency to transform, whereas neonatal RSCs show subtle chromosome abnormalities only after extensive expansion. These characteristics should help to identify the optimal cell source and culture conditions for cell transplantation studies. These results will be discussed in light of other studies using RSCs as well as embryonic stem cells. Another important factor to consider is the host environment, which plays a crucial role in cell integration and which has been poorly studied in both the normal and the diseased retina. Nonetheless, important results have recently been generated that call for the cell transplantation strategy to be reconsidered. Perspectives on enhancing cell integration by manipulating the environment will also be presented.
Abstract:
FFT (Fast Fourier Transform) factorizations that exhibit a regular interconnection pattern between factors or stages are known as parallel algorithms, or Pease algorithms, since they were originally proposed by Pease. In this contribution, new factorizations have been developed whose blocks exhibit Pease's regular interconnection pattern. It is shown how these blocks can be obtained at a previously selected scale. The new factorizations, for both the FFT and the IFFT (Inverse FFT), have two classes of factors: a few Cooley-Tukey-type factors and the new factors that provide the same Pease interconnection pattern within blocks. For a given factorization, the blocks share dimensions and the stage-to-stage interconnection pattern, and in addition each of them can be computed independently of the others.
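As a point of reference for the interconnection pattern discussed above, the sketch below (Python, illustrative only; it implements the classical constant-geometry radix-2 pattern usually attributed to Pease, not the new block factorizations of this contribution) shows how every stage reads the pair (j, j + N/2) and writes to (2j, 2j + 1), so the stage-to-stage data flow is identical throughout.

    import cmath

    def constant_geometry_fft(x):
        """Radix-2 decimation-in-frequency FFT with the constant-geometry
        (Pease-style) data flow: every stage uses the same read/write pattern.
        The output is returned in bit-reversed order."""
        N = len(x)
        stages = N.bit_length() - 1
        assert N == 1 << stages, "length must be a power of two"
        x = [complex(v) for v in x]
        for s in range(stages):
            y = [0j] * N
            for j in range(N // 2):
                a, b = x[j], x[j + N // 2]
                exponent = (j >> s) << s                      # twiddle exponent of this stage
                w = cmath.exp(-2j * cmath.pi * exponent / N)
                y[2 * j] = a + b                              # sum output
                y[2 * j + 1] = (a - b) * w                    # twiddled difference
            x = y
        return x

    def undo_bit_reversal(x):
        """Reorder a bit-reversed result into natural frequency order."""
        n = len(x).bit_length() - 1
        return [x[int(format(i, "0{}b".format(n))[::-1], 2)] for i in range(len(x))]

    # Tiny check against the defining DFT sum for N = 8 (assumed test data).
    data = [1, 2, 3, 4, 0, -1, -2, -3]
    fft_result = undo_bit_reversal(constant_geometry_fft(data))
    dft_result = [sum(v * cmath.exp(-2j * cmath.pi * k * m / 8)
                      for m, v in enumerate(data)) for k in range(8)]
    assert all(abs(a - b) < 1e-9 for a, b in zip(fft_result, dft_result))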
Abstract:
Nonstructural protein 4B (NS4B) plays an essential role in the formation of the hepatitis C virus (HCV) replication complex. It is a relatively poorly characterized integral membrane protein predicted to comprise four transmembrane segments in its central portion. Here, we describe a novel determinant for membrane association represented by amino acids (aa) 40 to 69 in the N-terminal portion of NS4B. This segment was sufficient to target and tightly anchor the green fluorescent protein to cellular membranes, as assessed by fluorescence microscopy as well as membrane extraction and flotation analyses. Circular dichroism and nuclear magnetic resonance structural analyses showed that this segment comprises an amphipathic alpha-helix extending from aa 42 to 66. Attenuated total reflection infrared spectroscopy and glycosylation acceptor site tagging revealed that this amphipathic alpha-helix has the potential to traverse the phospholipid bilayer as a transmembrane segment, likely upon oligomerization. Alanine substitution of the fully conserved aromatic residues on the hydrophobic helix side abrogated membrane association of the segment comprising aa 40 to 69 and disrupted the formation of a functional replication complex. These results provide the first atomic resolution structure of an essential membrane-associated determinant of HCV NS4B.
Abstract:
This document presents an image edge detector based on the transform domain. Starting from the interpretation of the Fourier transform of the image and its matrix formulation in terms of the different modes, a selection of the low-pass components is made, from which the low-frequency component is reconstructed and subtracted from the original image in order to obtain the detector. This edge detector is unbiased. The algorithm can be applied using different processing block sizes, ranging from the whole image down to blocks of small dimensions (36x36, 16x16 or 8x8), in order to track the local properties of the image when it exhibits spatially non-uniform characteristics.
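A minimal sketch of the idea described above (Python/NumPy, written for this listing; the keep_fraction parameter, the square low-pass mask and the function name are assumptions, since the document selects the low-pass modes through its matrix formulation): keep only the lowest-frequency Fourier components, rebuild the low-frequency image from them, and subtract it from the original so that the residual highlights the contours.

    import numpy as np

    def transform_domain_edge_detector(image, keep_fraction=0.1):
        """Low-pass the image in the Fourier domain and subtract the result
        from the original; the residual keeps only the high-frequency content,
        i.e. the contours.  Can be applied to the whole image or block-wise."""
        rows, cols = image.shape
        spectrum = np.fft.fftshift(np.fft.fft2(image))    # low frequencies at the centre
        mask = np.zeros((rows, cols))
        r = max(1, int(rows * keep_fraction / 2))
        c = max(1, int(cols * keep_fraction / 2))
        cr, cc = rows // 2, cols // 2
        mask[cr - r:cr + r + 1, cc - c:cc + c + 1] = 1.0  # select the low-pass modes
        low = np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real
        return image - low

    # Example on a synthetic image with a bright square in the middle.
    img = np.zeros((64, 64))
    img[20:44, 20:44] = 1.0
    edges = transform_domain_edge_detector(img, keep_fraction=0.2)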
Abstract:
The approaches and opinions of economists often dominate public policy discussion. Economists have gained this privileged position partly (or perhaps mainly) because of the obvious relevance of their subject matter, but also because of the unified methodology (neo-classical economics) that the vast majority of modern economists bring to their analysis of policy problems and proposed solutions. The idea of Pareto efficiency and its potential trade-off with equity is a central idea that is understood by all economists, and this common language provides the economics profession with a powerful voice in public affairs. The purpose of this paper is to review and reflect upon the way in which economists find themselves analysing and providing suggestions for social improvements and how this role has changed over roughly the last 60 years. We focus on the fundamental split in the public economics tradition between those that adhere to public finance and those that adhere to public choice. A pure public finance perspective views failures in society as failures of the market. The solutions are technical, as might be enacted by a benevolent dictator. The pure public choice view accepts (sometimes grudgingly) that markets may fail, but so, it insists, does politics. This points to institutional reforms to constrain the potential for political failure. Certain policy recommendations may be viewed as compatible with both traditions, but other policy proposals will be the opposite of what the other tradition would propose. In recent years a political economics synthesis has emerged. This accepts that institutions are very important and governments require constraints, but also that some degree of benevolence on the part of policy makers should not be ruled out. The implications for public policy from this approach are, however, much less clear and perhaps more piecemeal. We also discuss analyses of systematic failure, not so much on the part of markets or politicians, but on the part of voters. Most clearly, this could lead to populism and to relaxing the idea that voters necessarily choose in line with their own interests. The implications for public policy are addressed. Throughout the paper we relate the discussion to the experience of UK government policy-making.
Abstract:
The effect of mortality reductions on fertility is one of the main mechanisms stressed by the recent growth literature in order to explain demographic transitions. We analyze the empirical relevance of this mechanism based on the experience of all countries since 1960. We distinguish between the effects on gross and net fertility, take into account the dynamic nature of the relationship and control for alternative explanatory factors and for endogeneity. Our results show that mortality plays a large role in fertility reductions, that the change in fertility behavior comes with a lag of about 10 years and that both net and gross fertility are affected. We find comparatively little support for explanations of the demographic transition based on economic development or technological change.
Abstract:
We give a unified solution to the conjugacy problem in Thompson's groups F, V, and T using strand diagrams, and we analyze the complexity of the resulting algorithms.