828 results for GRASP filtering
Abstract:
The paratext framework is now used in a variety of fields to assess, measure, analyze, and comprehend the elements that provide thresholds, allowing scholars to better understand digital objects. Researchers from many disciplines revisit paratextual theories in order to grasp what surrounds text in the digital age. Examining Paratextual Theory and its Applications in Digital Culture suggests a theoretical and practical tool for building bridges between disciplines interested in conducting joint research and exploration of digital culture. Helping scholars from different fields find an interdisciplinary framework and common language to study digital objects, this book serves as a useful reference for academics, librarians, professionals, researchers, and students, offering a collaborative outlook and perspective.
Abstract:
This theoretical thesis deals with psychotherapy, and in particular with two of its forms, psychoanalysis and the brief therapy of the Palo Alto school, which it examines within the framework of debates bearing mainly on metatheoretical efforts to think through modernity, postmodernity and the phenomena accompanying them: rationalization, individualization, and cognitive and moral skepticism or relativism. It is proposed that psychotherapy can be regarded, beyond what has been said about the essentially narcissistic character of this practice, as a contribution to social emancipation insofar as it fosters the moral development of persons. The aim here is to show that another reading of this reality is possible, with the help of resources provided by the sociological tradition. This moral development would be fostered by self-reflexive functioning and by communicative competencies, the latter expressing, in Habermas's thought, moral consciousness. But for self-reflexive functioning to be possible, one must accept a capacity to know, and to know oneself, something that therapies influenced by postmodernism do not readily admit. Yet an examination of the discourse practitioners themselves hold about their practices reveals the influence of postmodernism, whether in the form of constructivism, social constructionism or, more generally, a certain skepticism and a concomitant rejection of expertise and authority, a paradoxical situation for a professional practice. The two forms of therapy selected, taken to represent the two poles of therapeutic intervention (the technical, strategic pole and the expressivist, communicational pole), are examined in light of propositions put forward by Habermas, notably on strategic and communicative rationality and on the ideal speech situation. Psychotherapy appears here as an invaluable contribution to a rationalization of the lifeworld. Building on a deeper treatment of the notions of modernity and postmodernism, the exploration continues with a detailed critique of Foucault's works on disciplinary practices, the main objection to conceiving of psychotherapies as emancipatory. The thesis seeks to show that these analyses no longer reflect the contemporary situation. Finally, the thesis examines the debate between Habermas and Foucault from the angle of the relationship between critique and power: if knowledge is always the product of power relations, and if it always has power effects, how can it claim to be critical? It emerges that Habermas's work, besides possessing far more attributes capable of supporting the problematic, offers a more balanced and nuanced theorization of the gains associated with modernity, whereas Foucault, apart from offering no hope of progress or gains in rationality, leaves us a conception of power that is more realistic (power is embedded in all communication and all interaction) but also more fatalistic, with no possibility of redemption through knowledge. The thesis concludes by returning to the notion of individualism with L. Dumont, Lipovetsky, Taylor, and Bellah et al., in order to discuss the social phenomena that some critics link to the existence of psychotherapies, notably the instrumentality of relationships.
Abstract:
The references come from the Centre des archives diplomatiques de Nantes (France).
Abstract:
Urbanization represents a major threat to biodiversity. This master's thesis aims to understand its effects on functional composition and biotic homogenization in riparian forests. Floristic inventories were carried out in 57 riparian forests of the Montreal region. To study how functional composition varies with urbanization, community-weighted means of traits were calculated for trees, shrubs and herbaceous plants. Each forest was characterized by variables describing the surrounding urban landscape, the local forest conditions and spatial processes. Local conditions, particularly flooding, exerted the dominant selection pressure on traits. The landscape effect was indirect, acting through the alteration of hydrological regimes. Dispersal along rivers was also an important process structuring riparian forests. Changes in the taxonomic and functional β-diversity of herbaceous plants were examined across three levels of urbanization and flooding. While urbanization promoted taxonomic differentiation, flooding promoted taxonomic homogenization, without influencing functional β-diversity. Urbanization was the trigger for changes in β-diversity, directly, by causing a gain in exotic species and a decrease in total richness in highly urbanized forests, and indirectly, by driving substantial species turnover through the alteration of hydrological regimes. Overall, these results suggest that the modification of natural processes by human activities is the main driver of change in urban riparian communities.
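For readers unfamiliar with the metric, a community-weighted mean is simply an abundance-weighted average of a trait over the species present in a plot. A minimal Python sketch follows; the species, abundances and trait values are hypothetical illustrations, not data from the study:

```python
# Minimal sketch of a community-weighted mean (CWM) of one trait in one plot.
# Species, abundances and trait values below are hypothetical illustrations.
abundances = {"Acer saccharinum": 30, "Salix nigra": 12, "Ulmus americana": 5}
trait_sla = {"Acer saccharinum": 18.2, "Salix nigra": 14.5, "Ulmus americana": 21.0}

def community_weighted_mean(abundances, traits):
    """Abundance-weighted average of a trait over the species in one plot."""
    total = sum(abundances.values())
    return sum(abundances[sp] * traits[sp] for sp in abundances) / total

print(community_weighted_mean(abundances, trait_sla))
```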
Abstract:
This master's thesis aims to understand the work experience of agents working in social reintegration with provincially sentenced offenders serving their sentence in the community in Quebec. More specifically, this research seeks to grasp the role played by these agents in a context where they carry out a dual mandate of public safety and social reintegration. The study also attempts to shed light on their work practices, which are embedded in a logic of effective risk management. Finally, this thesis aims to understand the place social reintegration occupies in their work. To this end, a qualitative approach was used to conduct fifteen (15) interviews with probation officers and community-sector workers responsible for supervising provincial offenders in the community. Two (2) main themes emerge from these interviews. On the one hand, "The work" is described by the participants in terms of the dual role they play, the legal and clinical responsibilities incumbent on them, and intervention with offenders centred on risk and social reintegration. On the other hand, "The work context" refers to the partnership established between practitioners, the use of actuarial tools, and the bodies shaping their work practices: the media, the Commission québécoise des libérations conditionnelles and the Services correctionnels du Québec. Our analyses show that public safety pursued through effective risk management manifests itself in a rationalization of work practices and in the adoption of a control role. It appears, however, that these two (2) aspects are motivated above all by the desire to help the offender population. Finally, the study shows that social reintegration is only one goal of intervention among others. Participants must juggle these various aims in order to adjust to the main actor in their work: the offender.
Abstract:
The synthesis of so-called photorealistic images requires numerically evaluating how light and matter physically interact, which, despite the impressive and ever-increasing computing power we enjoy today, is still far from being a trivial task for our computers. This is largely due to the way we represent objects: to reproduce the subtle interactions that lead to the perception of detail, phenomenal amounts of geometry must be modeled. At render time, this complexity inexorably leads to heavy input/output requests which, coupled with the evaluation of complex filtering operators, make the computation times required to produce artifact-free images wholly unreasonable. To overcome these limitations under current constraints, a multiscale representation of matter must be derived. In this thesis, we build such a representation for matter whose interface corresponds to a displaced surface, a configuration generally built from elevation maps in computer graphics. We derive our representation in the context of microfacet theory (originally designed to model the reflectance of rough surfaces), which we first present and then extend in two steps. First, we make the theory applicable across multiple observation scales by generalizing it to non-centered microfacet statistics. Second, we derive an inversion procedure capable of reconstructing microfacet statistics from the reflectance responses of an arbitrary material in retroreflection configurations. We show how this extended theory can be exploited to derive a general and efficient approximate resampling operator for elevation maps that (a) preserves the anisotropy of light transport at any resolution, (b) can be applied before rendering and stored in MIP maps to drastically reduce the number of input/output requests, and (c) considerably simplifies per-pixel filtering operations, all of which leads to shorter rendering times. To validate and demonstrate the effectiveness of our operator, we synthesize antialiased photorealistic images and compare them with reference images. In addition, we provide a complete C++ implementation throughout the dissertation to facilitate the reproduction of the results. We conclude with a discussion of the limitations of our approach, and of the obstacles that remain to be overcome in order to derive an even more general multiscale representation of matter.
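For context on point (b) above, the standard MIP-map construction the operator plugs into simply halves an elevation map's resolution at each level by box-filtering. A minimal Python sketch of that plain baseline, assuming a square, power-of-two map (this is not the thesis's anisotropy-preserving operator, which replaces the naive averaging):

```python
import numpy as np

def mip_levels(height_map):
    """Build a naive MIP chain for a square, power-of-two elevation map.

    Each level halves the resolution by averaging 2x2 texel blocks.
    This is the plain box-filter baseline, not the anisotropy-preserving
    resampling operator derived in the thesis.
    """
    levels = [height_map]
    while levels[-1].shape[0] > 1:
        h = levels[-1]
        n = h.shape[0] // 2
        coarse = h.reshape(n, 2, n, 2).mean(axis=(1, 3))
        levels.append(coarse)
    return levels

heights = np.random.rand(8, 8)   # hypothetical 8x8 elevation map
for lvl in mip_levels(heights):
    print(lvl.shape)             # (8, 8), (4, 4), (2, 2), (1, 1)
```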
Abstract:
3-D assessment of scoliotic deformities relies on an accurate 3-D reconstruction of bone structures from biplanar X-rays, which requires a precise detection and matching of anatomical structures in both views. In this paper, we propose a novel semiautomated technique for detecting complete scoliotic rib borders from PA-0° and PA-20° chest radiographs, by using an edge-following approach with multiple-path branching and oriented filtering. Edge-following processes are initiated from user starting points along upper and lower rib edges and the final rib border is obtained by finding the most parallel pair among detected edges. The method is based on a perceptual analysis leading to the assumption that no matter how bent a scoliotic rib is, it will always present relatively parallel upper and lower edges. The proposed method was tested on 44 chest radiographs of scoliotic patients and was validated by comparing pixels from all detected rib borders against their reference locations taken from the associated manually delineated rib borders. The overall 2-D detection accuracy was 2.64 ± 1.21 pixels. Comparing this accuracy level to reported results in the literature shows that the proposed method is very well suited for precisely detecting borders of scoliotic ribs from PA-0° and PA-20° chest radiographs.
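As a rough illustration of the edge-following idea (a simplified single-path tracer, not the authors' multiple-path branching algorithm with oriented filtering), the sketch below steps greedily from a user seed toward whichever neighbour best continues the current edge direction; the gradient image and seed are hypothetical stand-ins:

```python
import numpy as np

def follow_edge(grad_mag, seed, direction, steps=50):
    """Greedy edge-following sketch: from `seed`, repeatedly move to the
    8-neighbour with the highest gradient magnitude that stays within
    roughly +/-45 degrees of the current direction."""
    y, x = seed
    path = [(y, x)]
    dy, dx = direction
    for _ in range(steps):
        best, best_score = None, -np.inf
        for ny, nx in [(-1,-1),(-1,0),(-1,1),(0,-1),(0,1),(1,-1),(1,0),(1,1)]:
            if ny * dy + nx * dx <= 0:   # discard steps that reverse or turn too far
                continue
            py, px = y + ny, x + nx
            if 0 <= py < grad_mag.shape[0] and 0 <= px < grad_mag.shape[1]:
                if grad_mag[py, px] > best_score:
                    best, best_score = (ny, nx), grad_mag[py, px]
        if best is None:
            break
        dy, dx = best
        y, x = y + dy, x + dx
        path.append((y, x))
    return path

grad = np.random.rand(64, 64)          # stand-in for an oriented-filter response
print(len(follow_edge(grad, (32, 5), (0, 1))))
```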
Abstract:
Neural networks have emerged as the topic of the day. The spectrum of their applications is wide, from ECG noise filtering to seismic data analysis, and from elementary particle detection to electronic music composition. The focal point of the proposed work is the application of a massively parallel connectionist network model to the detection of a sonar target. This task is segmented into: (i) generation of training patterns from sea noise that contains the radiated noise of a target, for teaching the network; (ii) selection of a suitable network topology and learning algorithm; and (iii) training of the network and its subsequent testing, where the network detects, in unknown patterns applied to it, the presence of the features it has already learned. A three-layer perceptron using backpropagation learning is initially subjected to recursive training with example patterns (derived from sea ambient noise with and without the radiated noise of a target). On every presentation, the error in the output of the network is propagated back, and the weights and the bias associated with each neuron in the network are modified in proportion to this error measure. During this iterative process, the network converges and extracts the target features, which become encoded into its generalized weights and biases. In every unknown pattern that the converged network is subsequently confronted with, it searches for the features already learned and outputs an indication of their presence or absence. This capability for target detection is exhibited by the response of the network to various test patterns presented to it. Three network topologies are tried with two variants of backpropagation learning, and the performance of each combination is subsequently graded.
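The training loop described above can be sketched as follows. The network size, data and learning rate are hypothetical stand-ins rather than the thesis's sonar patterns, and the code shows one common formulation of three-layer backpropagation rather than the exact variant used:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in patterns: 8-dimensional inputs, binary target label.
X = rng.normal(size=(100, 8))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

W1 = rng.normal(scale=0.5, size=(8, 6)); b1 = np.zeros(6)   # hidden layer
W2 = rng.normal(scale=0.5, size=(6, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.1

for epoch in range(2000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: propagate the output error, update weights and biases
    err_out = (out - y) * out * (1 - out)
    err_hid = (err_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ err_out;  b2 -= lr * err_out.sum(axis=0)
    W1 -= lr * X.T @ err_hid;  b1 -= lr * err_hid.sum(axis=0)

print("training accuracy:", ((out > 0.5) == y).mean())
```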
Abstract:
A new procedure for the classification of lower-case English-language characters is presented in this work. The character image is binarised, and the binary image is further divided into sixteen smaller areas called cells. Each cell is assigned a name depending upon the contour present in the cell and the occupancy of the image contour in the cell. A data reduction procedure called filtering is adopted to eliminate undesirable redundant information, reducing complexity during further processing steps. The filtered data is fed into a primitive extractor, where the primitives are extracted. Syntactic methods are employed for the classification of the character. A decision tree is used for the interaction of the various components in the scheme, like primitive extraction and character recognition. A character is recognized by the primitive-by-primitive construction of its description. Open-ended inventories are used for including variants of the characters and also for adding new members to the general class. Computer implementation of the proposal is discussed at the end using handwritten character samples. Results are analyzed and suggestions for future studies are made. The advantages of the proposal are discussed in detail.
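As an illustration of the cell decomposition step, the sketch below splits a binary character image into a 4x4 grid of sixteen cells and measures contour occupancy per cell; the grid layout and occupancy measure are an assumed reading of the scheme, which the abstract describes only at a high level:

```python
import numpy as np

def cell_occupancy(binary_img):
    """Split a binary character image into a 4x4 grid of 16 cells and
    report, for each cell, the fraction of pixels covered by the contour."""
    h, w = binary_img.shape
    cells = {}
    for i in range(4):
        for j in range(4):
            cell = binary_img[i*h//4:(i+1)*h//4, j*w//4:(j+1)*w//4]
            cells[(i, j)] = cell.mean()
    return cells

img = np.zeros((32, 32), dtype=float)
img[8:24, 14:18] = 1.0                 # hypothetical vertical stroke
print(cell_occupancy(img)[(1, 1)])     # occupancy of one interior cell
```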
Abstract:
The thesis covers various aspects of the modeling and analysis of finite-mean time series with symmetric stable distributed innovations. Time series analysis based on Box-Jenkins methods is the most popular approach, where the models are linear and the errors are Gaussian. We highlight the limitations of classical time series analysis tools, explore some generalized tools, and organize the approach parallel to the classical set-up. In the present thesis we mainly study the estimation and prediction of the signal-plus-noise model, where we assume that the signal and the noise follow models with symmetric stable innovations. We start the thesis with some motivating examples and application areas of alpha-stable time series models. Classical time series analysis and the corresponding theory based on finite-variance models are extensively discussed in the second chapter, where we also survey the existing theory and methods for infinite-variance models. In the third chapter we present a linear filtering method for computing the filter weights assigned to the observations for estimating an unobserved signal in a general noisy environment, considering both the signal and the noise as stationary processes with infinite-variance innovations. We derive semi-infinite, doubly infinite and asymmetric signal extraction filters based on a minimum dispersion criterion. Finite-length filters based on Kalman-Levy filters are developed, and the pattern of the filter weights is identified. Simulation studies show that the proposed methods are competent at signal extraction for processes with infinite variance. Parameter estimation of autoregressive signals observed in a symmetric stable noise environment is discussed in the fourth chapter. Here we use higher-order Yule-Walker-type estimation based on the auto-covariation function, and we exemplify the methods by simulation and by an application to sea surface temperature data. We increase the number of Yule-Walker equations and propose an ordinary least squares estimate of the autoregressive parameters. The singularity problem of the auto-covariation matrix is addressed, and a modified version of the generalized Yule-Walker method using singular value decomposition is derived. In the fifth chapter of the thesis we introduce the partial covariation function as a tool for stable time series analysis where the covariance or partial covariance is ill defined. Asymptotic results for the partial auto-covariation are studied, and its application to model identification of stable autoregressive models is discussed. We generalize the Durbin-Levinson algorithm to include infinite-variance models in terms of the partial auto-covariation function, and introduce a new information criterion for consistent order estimation of stable autoregressive models. In chapter six we explore the application of the techniques discussed in the previous chapter to signal processing. Frequency estimation of a sinusoidal signal observed in a symmetric stable noisy environment is discussed in this context. Here we introduce a parametric spectrum analysis and a frequency estimate using the power transfer function, an estimate of which is obtained using the modified generalized Yule-Walker approach. Another important problem in statistical signal processing is to identify the number of sinusoidal components in an observed signal; we use a modified version of the proposed information criterion for this purpose.
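For orientation, the classical finite-variance Yule-Walker estimator that the fourth chapter generalizes solves a Toeplitz linear system built from sample autocovariances. A baseline Python sketch follows; note that for stable innovations the thesis replaces covariances with auto-covariations, which this sketch does not attempt:

```python
import numpy as np

def yule_walker(x, order):
    """Classical Yule-Walker AR(p) estimate from sample autocovariances.
    Finite-variance baseline only; the stable-innovations variant in the
    thesis substitutes the auto-covariation function for the covariance."""
    x = np.asarray(x) - np.mean(x)
    n = len(x)
    gamma = np.array([x[:n-k] @ x[k:] / n for k in range(order + 1)])
    R = np.array([[gamma[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, gamma[1:])

# Hypothetical AR(2) simulation to exercise the estimator.
rng = np.random.default_rng(1)
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 0.6 * x[t-1] - 0.3 * x[t-2] + rng.normal()
print(yule_walker(x, 2))   # should be close to [0.6, -0.3]
```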
Abstract:
The increasing tempo of construction activity the world over creates heavy pressure on existing land space. The quest for new and competent sites often points to the need to improve existing sites which are otherwise deemed unsuitable for conventional foundations. This is accomplished by ground improvement methods, which are employed to improve the quality of soil that is incompetent in its natural state. Among construction activities, a well-connected road network is one of the basic infrastructure requirements, playing a vital role in the fast and comfortable movement of inter-regional traffic in countries like India. One of the innovative ground improvement techniques practised all over the world is the use of geosynthetics, which include geotextiles, geomembranes, geogrids, etc. They offer advantages such as space saving, environmental sensitivity, material availability, technical superiority, higher cost savings and less construction time. Because of its fundamental properties, such as tensile strength, filtering and water permeability, a geotextile inserted between the base material and the subgrade can function as reinforcement, a filter medium, a separation layer and a drainage medium. Though polymeric geotextiles are used in abundant quantities, the use of natural geotextiles (like coir, jute, etc.) has yet to gain momentum. This is primarily due to the lack of research work on natural geotextiles for ground improvement, particularly in the area of unpaved roads. Coir geotextiles are best suited for low-cost applications because of their availability at low prices compared to their synthetic counterparts. The proper utilisation of coir geotextiles in various applications demands large quantities of the product, which in turn can create a boom in the coir industry. The present study aims at exploring the possibilities of utilising coir geotextiles for unpaved roads and embankments. The properties of the coir geotextiles used have been evaluated, including mass per unit area, puncture resistance, tensile strength and secant modulus. The interfacial friction between the soils and the three types of coir geotextiles used was also evaluated. It was found that though the parameters evaluated for coir geotextiles have low values compared to polymeric geotextiles, they are sufficient for use in unpaved roads and embankments. The frictional characteristics of coir geotextile-soil interfaces are extremely good and satisfy the condition set by the International Geosynthetic Society for varied applications. The performance of coir geotextile reinforced subgrade was studied by conducting California Bearing Ratio (CBR) tests. Studies were made with coir geotextiles placed at different levels and also in multiple layers. The results showed that the coir geotextile enhances the subgrade strength. A regression analysis was performed and a mathematical model was developed to predict the CBR of the coir geotextile reinforced subgrade soil as a function of the soil properties, the coir geotextile properties and the placement depth of the reinforcement. The effects of coir geotextiles on bearing capacity were studied by performing plate load tests in a test tank. This helped in understanding the functioning of the geotextile as reinforcement in unpaved roads and embankments. The performance of different types of coir geotextiles with respect to placement depth in dry and saturated conditions was studied.
The results revealed that the bearing capacity of coir-reinforced soil increases irrespective of the type of coir geotextile and the saturation condition. The rut behaviour of unreinforced and coir-reinforced unpaved road sections was compared by conducting model static load tests in a test tank and also under repetitive loads in a wheel track test facility. The results showed that coir geotextiles can fulfil the functions of reinforcement and of a separator, both under static and repetitive loads. The rut depth was greatly reduced by placing coir geotextiles between the subgrade and the sub-base. In order to study the use of coir geotextiles in improving settlement characteristics, two types of prefabricated coir geotextile vertical drains were developed and their time-settlement behaviour was studied. Three different dispositions were tried. It was found that the coir geotextile drains were very effective in reducing consolidation time due to radial drainage. The circular drains in triangular disposition gave the maximum beneficial effect. In the long run, degradation of the coir geotextile is expected, resulting in a soil-fibre matrix. Hence, studies pertaining to the strength and compressibility characteristics of soil-coir fibre composites were conducted. Experiments were done using coir fibres having different aspect ratios and in different proportions. The results revealed that the strength of the soil increased by 150% to 200% when mixed with 2% of fibre approximately 12 mm in length, at all compaction conditions. Also, the coefficient of consolidation increased and the compression index decreased with the addition of coir fibre. Typical design charts were prepared for the design of coir geotextile reinforced unpaved roads, and some illustrative examples are given. The results demonstrated that a considerable saving in sub-base/base thickness can be achieved with the use of coir geotextiles, which in turn would save large quantities of natural aggregates.
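The abstract names the inputs to the CBR regression model (soil properties, geotextile properties, placement depth) but not its form or coefficients. As a generic illustration of fitting such a predictive model, an ordinary least-squares sketch with hypothetical predictors and values:

```python
import numpy as np

# Hypothetical predictors for a CBR model: the abstract names only the
# categories of inputs, so these variables and values are illustrative.
X = np.array([
    # plasticity_index, geotextile_mass_g_m2, placement_depth_mm
    [12.0, 400.0, 20.0],
    [15.0, 700.0, 40.0],
    [18.0, 400.0, 60.0],
    [20.0, 700.0, 20.0],
    [14.0, 400.0, 40.0],
    [17.0, 700.0, 60.0],
])
cbr = np.array([6.1, 7.4, 5.2, 6.8, 5.9, 6.4])  # hypothetical measured CBR

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, cbr, rcond=None)
print("intercept and coefficients:", coef)
```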
Abstract:
Computational biology is the research area that contributes to the analysis of biological data through the development of algorithms that address significant research problems. The data from molecular biology include DNA, RNA, protein and gene expression data. Gene expression data provide the expression levels of genes under different conditions. Gene expression is the process of transcribing the DNA sequence of a gene into mRNA sequences, which in turn are later translated into proteins. The number of copies of mRNA produced is called the expression level of a gene. Gene expression data are organized in the form of a matrix: rows represent genes, columns represent experimental conditions (different tissue types or time points), and the entries are real values. Through the analysis of gene expression data it is possible to determine behavioural patterns of genes, such as the similarity of their behaviour, the nature of their interactions, and their respective contributions to the same pathways. Similar expression patterns are exhibited by genes participating in the same biological process. These patterns have immense relevance and application in bioinformatics and clinical research; in the medical domain they aid more accurate diagnosis, prognosis, treatment planning, drug discovery and protein network analysis. To identify such patterns in gene expression data, data mining techniques are essential. Clustering is an important data mining technique for the analysis of gene expression data; to overcome the problems associated with clustering, biclustering is introduced. Biclustering refers to the simultaneous clustering of both rows and columns of a data matrix: clustering is a global model, whereas biclustering is a local one. Discovering local expression patterns is essential for identifying many genetic pathways that are not apparent otherwise, so it is necessary to move beyond the clustering paradigm towards approaches capable of discovering local patterns in gene expression data. A bicluster is a submatrix of the gene expression data matrix whose rows and columns need not be contiguous, and biclusters are not disjoint. Computing biclusters is costly, because all combinations of rows and columns must be considered to find all the biclusters: the search space for the biclustering problem is 2^(m+n), where m and n are the numbers of genes and conditions respectively, and usually m+n is more than 3000. The biclustering problem is NP-hard. Biclustering is a powerful analytical tool for the biologist. The research reported in this thesis addresses the problem of biclustering. Ten algorithms are developed for the identification of coherent biclusters from gene expression data. All of these algorithms make use of a measure called the mean squared residue to search for biclusters, the objective being to identify biclusters of maximum size with a mean squared residue lower than a given threshold.
All of these algorithms begin the search from tightly coregulated submatrices called seeds, which are generated by the K-means clustering algorithm. The algorithms developed can be classified as constraint-based, greedy and metaheuristic. The constraint-based algorithms use one or more constraints, namely the MSR threshold and the MSR difference threshold. The greedy approach makes a locally optimal choice at each stage with the objective of finding the global optimum. In the metaheuristic approaches, Particle Swarm Optimization (PSO) and variants of the Greedy Randomized Adaptive Search Procedure (GRASP) are used for the identification of biclusters. These algorithms are implemented on the Yeast and Lymphoma datasets. Biologically relevant and statistically significant biclusters are identified by all these algorithms and validated against the Gene Ontology database. All these algorithms are compared with other biclustering algorithms, and the algorithms developed in this work overcome some of the problems associated with existing algorithms. With the help of some of the algorithms developed in this work, biclusters with very high row variance (higher than that of any other algorithm using the mean squared residue) are identified from both the Yeast and Lymphoma data sets. Such biclusters, which reflect significant changes in expression level, are highly relevant biologically.
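The mean squared residue all ten algorithms score against is the Cheng-Church coherence measure: the average squared deviation of each entry from its row mean plus column mean minus overall mean. A minimal Python sketch (the example submatrix is hypothetical):

```python
import numpy as np

def mean_squared_residue(submatrix):
    """Cheng-Church mean squared residue of a candidate bicluster:
    mean of (entry - row mean - column mean + overall mean) squared."""
    row_means = submatrix.mean(axis=1, keepdims=True)
    col_means = submatrix.mean(axis=0, keepdims=True)
    overall = submatrix.mean()
    residue = submatrix - row_means - col_means + overall
    return (residue ** 2).mean()

# Hypothetical bicluster: each row is a constant shift of the others,
# so the pattern is perfectly coherent and the MSR is 0.
b = np.array([[1.0, 3.0, 2.0],
              [2.0, 4.0, 3.0],
              [5.0, 7.0, 6.0]])
print(mean_squared_residue(b))   # 0.0 for an additive (coherent) pattern
```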
Abstract:
This thesis is entitled "Photonic applications of biomaterials with special reference to biopolymers and microbes". A detailed investigation is presented of direct applications of biopolymers in selected areas of photonics, and of how the growth kinetics of an aerial bacterial colony on solid agar media was studied using the laser-induced fluorescence technique. This chapter is an overview of the spectrum of biomaterials and their applications in photonics. The chapter discusses a wide range of biomaterials-based photonics applications, such as efficient harvesting of solar energy, low-threshold lasing, high-density data storage, optical switching, filtering, and templates for nanostructures. The most extensively investigated photonics application in biology is the laser-induced fluorescence technique. The importance of fluorescence studies in different biological and related fields is also mentioned in this chapter.
Abstract:
In a sigma-delta analog-to-digital (A/D) converter, the most computationally intensive block is the decimation filter, and its hardware implementation may require millions of transistors. Since these converters are now targeted at portable applications, a hardware-efficient design is an implicit requirement. To this end, this paper presents a computationally efficient polyphase implementation of non-recursive cascaded integrator comb (CIC) decimators for Sigma-Delta Converters (SDCs). SDCs operate at high oversampling frequencies and hence require large sampling rate conversions. The filtering and rate reduction are performed in several stages to reduce hardware complexity and power dissipation. CIC filters are widely adopted as the first stage of decimation due to their multiplier-free structure. In this research, the performance of the polyphase structure is compared with recursive and non-recursive CIC implementations in terms of power, speed and area. The polyphase implementation offers high-speed operation and low power consumption. The polyphase implementation of a 4th-order CIC filter with a decimation factor of 64 and an input word length of 4 bits offers about 70% and 37% power savings compared to the corresponding recursive and non-recursive implementations, respectively. The same polyphase CIC filter can operate about 7 times faster than the recursive and about 3.7 times faster than the non-recursive CIC filters.
As most sigma-delta ADC applications require decimation filters with linear phase characteristics, symmetric Finite Impulse Response (FIR) filters are widely used for implementation, but the number of FIR filter coefficients becomes quite large for a narrow-band decimation filter. Implementing the decimation filter in several stages reduces the total number of filter coefficients, and hence the hardware complexity and power consumption [2]. The first stage of the decimation filter can be implemented very efficiently using a cascade of integrators and comb filters, which require no multiplication or coefficient storage. The remaining filtering is performed either in a single stage or in two stages with more complex FIR or infinite impulse response (IIR) filters, according to the requirements. The amount of passband aliasing or imaging error can be brought within prescribed bounds by increasing the number of stages in the CIC filter, but the width of the passband and the frequency characteristics outside the passband are severely limited. CIC filters are therefore used to make the transition between high and low sampling rates, and conventional filters operating at the low sampling rate are used to attain the required transition bandwidth and stopband attenuation. Several papers are available in the literature that deal with different implementations of decimation filter architectures for sigma-delta ADCs. Hogenauer has described the design procedures for decimation and
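For reference, an Nth-order CIC decimator with decimation factor R and unit differential delay is equivalent to N cascaded length-R moving-average (box) filters followed by R-fold downsampling, which is the non-recursive form the paper's polyphase structure exploits. A behavioural Python sketch of that equivalence (not the paper's polyphase hardware mapping):

```python
import numpy as np

def cic_decimate(x, R=64, N=4):
    """Behavioural model of an Nth-order CIC decimator with factor R.

    Uses the non-recursive equivalence: N cascaded length-R box filters
    followed by R-fold downsampling. The paper's contribution is an
    efficient polyphase hardware mapping of this form, not modelled here.
    """
    box = np.ones(R)
    h = box
    for _ in range(N - 1):
        h = np.convolve(h, box)      # cascade of N box filters
    y = np.convolve(x, h)            # filter at the high input rate
    return y[::R]                    # keep every Rth output sample

sig = np.random.randn(4096)          # hypothetical oversampled input stream
print(cic_decimate(sig).shape)       # decimated output at 1/64 the rate
```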