78 results for "sintáticos"
Abstract:
Non-Photorealistic Rendering (NPR) is a class of techniques that aims to reproduce artistic techniques, trying to express feelings and moods in the rendered scenes and giving them the appearance of having been made "by hand". Another way of defining NPR is as the processing of scenes, images or videos into artwork, generating scenes, images or videos that can have the visual appeal of pieces of art, expressing the visual and emotional characteristics of artistic styles. This dissertation presents a new NPR method for the stylization of images and videos, based on a typical artistic expression of the Northeast region of Brazil that uses colored sand to compose landscape images on the inner surface of glass bottles. The method comprises one technique for generating 2D procedural sand textures and two techniques that mimic effects created by the artists with their tools. It also presents a method for generating 2D sandbox animations from the stylized video. The temporal coherence of these stylized videos can be enforced on individual objects with the aid of a video segmentation algorithm. The techniques presented in this work were used to stylize synthetic and real videos, something close to impossible for artists to produce in real life.
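As a rough illustration of what a 2D procedural sand texture can look like (a minimal sketch assuming a simple noise-based formulation; the dissertation's actual technique is not described here), the snippet below jitters a base sand color per pixel and lightly blurs the result so neighbouring grains blend:

```python
# Minimal sketch (assumed noise-based approach, not the dissertation's technique)
# of a 2D procedural sand texture: per-grain brightness variation around a base
# sand color, lightly smoothed so grains blend as in bottle sand art.
import numpy as np

def sand_texture(height, width, base_rgb=(0.76, 0.60, 0.42), grain=0.15, seed=0):
    """Return an (height, width, 3) float array in [0, 1] resembling colored sand."""
    rng = np.random.default_rng(seed)
    # Per-pixel brightness jitter simulates individual grains.
    noise = rng.normal(loc=1.0, scale=grain, size=(height, width))
    # Cheap 3x3 box blur to soften the grain pattern slightly.
    padded = np.pad(noise, 1, mode="edge")
    blurred = sum(padded[i:i+height, j:j+width] for i in range(3) for j in range(3)) / 9.0
    return np.clip(blurred[..., None] * np.array(base_rgb), 0.0, 1.0)

tex = sand_texture(256, 256)
print(tex.shape, tex.min(), tex.max())
```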
Abstract:
The Baixo Vermelho area, situated in the northern portion of the Umbuzeiro Graben (onshore Potiguar Basin), represents a typical example of a rift basin, characterized in the subsurface by the sedimentary rift sequence, correlated to the Pendência Formation (Valanginian-Barremian), and by the Carnaubais fault system. In this context, two main goals, the stratigraphic and the structural analysis, guided the research. For this purpose, a 3D seismic volume and eight wells located in the study area and its surroundings were used. The stratigraphic analysis of the Valanginian-Barremian interval was carried out in two distinct phases, 1D and 2D, in which the basic concepts of sequence stratigraphy were adapted. In these phases, the individual analysis of each well and the correlation between them made it possible to recognize the main lithofacies, to interpret the active depositional systems and to identify the genetic units and key surfaces of chronostratigraphic character. The analyzed lithofacies are represented predominantly by conglomerates, sandstones, siltstones and shales, with carbonate rocks and marls occurring subordinately. According to these lithofacies associations, the following depositional systems can be interpreted: alluvial fan, fluvio-deltaic and lacustrine. The alluvial fan system is composed mainly of conglomerate deposits, which developed preferentially in the southern portion of the area and are directly associated with the Carnaubais fault system. The fluvio-deltaic system, in turn, developed mainly in the northwestern portion of the area, at the flexural edge, and is characterized by coarse sandstones intercalated with shales and siltstones. The lacustrine system, the dominant one in the study area, is formed mainly by shales that may be intercalated with thin layers of fine to very fine sandstones, interpreted as turbidite deposits. The sequence stratigraphy units recognized in the wells are represented by parasequence sets, systems tracts and depositional sequences. The parasequence sets, progradational or retrogradational, were grouped and related to the systems tracts. The predominance of progradational parasequence sets (general coarsening-upward trend) characterizes the Regressive Systems Tract, while the more frequent occurrence of retrogradational parasequence sets (general fining-upward trend) represents the Transgressive Systems Tract. In the seismic stratigraphic analysis, the lithofacies described in the wells were related to chaotic, progradational and parallel/subparallel seismic facies, which are frequently associated with the alluvial fan, fluvio-deltaic and lacustrine depositional systems, respectively. In this analysis, it was possible to recognize fifteen seismic horizons that correspond to sequence boundaries and to maximum flooding surfaces, which separate Transgressive from Regressive systems tracts. The recognition of transgressive-regressive cycles allowed the identification of nine, possibly 3rd-order, depositional sequences, related to tectono-sedimentary cycles. The structural analysis, in turn, was performed on the Baixo Vermelho seismic volume, which clearly shows the structural complexity imprinted on the area, mainly related to the Carnaubais fault system, which acts as an important fault system of the rift border.
This fault system is characterized by a main arrangement of normal faults with a NE-SW trend, of which the Carnaubais Fault is the greatest expression. The Carnaubais Fault has a typically listric geometry, with a general N70°E trend, dipping to the northwest. It is observed throughout the entire seismic volume, with variations in its surface, which conditioned, during its evolutionary stages, the formation of numerous structural features normally identified in the Pendência Formation. In this unit, part of these features is related to the formation of longitudinal folds (rollover structures and associated extensional folding), originated by displacement along the main fault plane, which produced variations in the geometry and thickness of the adjacent layers deposited at the same time. Other structural features are related to secondary faults, which may be synthetic or antithetic to the Carnaubais Fault. In general, these faults have limited lateral continuity, with planar to listric geometry, and apparently accommodate the extensional deformation imprinted on the area. Thus, the interaction between the stratigraphic and structural analyses, supported by data of excellent quality, allowed a better understanding of the tectono-sedimentary evolution of the Valanginian-Barremian interval (Pendência Formation) in the studied area.
Abstract:
Seismic reflection is used on a large scale in oil exploration. In marine acquisition, the high impedance contrast at the water/air interface generates multiple reflection events. Such multiple events can mask primary events; thus, from the interpretational viewpoint, it is necessary to attenuate the multiples. In this manuscript we compare two methods of multiple attenuation: predictive multichannel deconvolution (DPM) and F-K filtering (FKF). DPM is based on the periodicity of the multiples, while FKF is based on the separation of multiples and primaries in the F-K domain. DPM and FKF were applied to common-offset and CDP gathers, respectively. DPM is quite sensitive to the correct identification of the period and the size of the filter, while FKF is quite sensitive to an adequate choice of velocity in order to separate multiple and primary events in the F-K domain. DPM is designed to act on a specific event, so, when the parameters are well selected, it is very efficient in removing the specified multiple; DPM can therefore be optimized by applying it several times, each time with a different parameterization. A deficiency of DPM occurs when a multiple is superimposed on a primary event: in this situation, DPM may also attenuate the primary event. On the other hand, FKF presents almost the same performance for all multiples located in the same sector of the F-K domain. The two methods can be combined in order to take advantage of their respective strengths: DPM is applied first, focusing on the sea-bed multiples, and FKF is then applied to attenuate the remaining multiples.
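To make the F-K filtering idea concrete, the sketch below is a minimal, generic fan filter (assumed parameters and a random array standing in for a gather; it is not the thesis implementation): the gather is transformed to the frequency-wavenumber domain and energy with apparent velocity below a chosen cutoff is muted before transforming back.

```python
# Minimal F-K filtering sketch (assumed cutoff velocity and sampling intervals).
import numpy as np

def fk_filter(gather, dt, dx, v_cut=1500.0):
    """gather: 2D array (n_time, n_offset); v_cut: apparent-velocity cutoff in m/s."""
    nt, nx = gather.shape
    spec = np.fft.fft2(gather)
    f = np.fft.fftfreq(nt, d=dt)[:, None]   # temporal frequencies (Hz)
    k = np.fft.fftfreq(nx, d=dx)[None, :]   # spatial wavenumbers (1/m)
    with np.errstate(divide="ignore", invalid="ignore"):
        v_app = np.abs(f) / np.abs(k)       # apparent velocity of each (f, k) cell
    mask = ~(v_app < v_cut)                 # mute slow (multiple-like) events
    return np.real(np.fft.ifft2(spec * mask))

# Usage on synthetic noise standing in for a CDP gather:
gather = np.random.default_rng(1).normal(size=(500, 60))
filtered = fk_filter(gather, dt=0.004, dx=25.0)
print(filtered.shape)
```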
Abstract:
The textile industry is one of the most polluting in the world (AHMEDCHEKKAT et al., 2011), generating wastewater with a high organic load. Among the pollutants present in these effluents are dyes, substances with complex structures, toxic and carcinogenic characteristics, and strong coloration. Improper disposal of these substances into the environment, without prior treatment, can cause major environmental impacts. The objective of this thesis is to use electrochemical oxidation on a boron-doped diamond (BDD) anode for the treatment of a synthetic dye and a real textile effluent; in addition, to study the behavior of different electrolytes (HClO4, H3PO4, NaCl and Na2SO4) and current densities (15, 60, 90 and 120 mA cm-2), and to compare photolysis, electrolysis and photoelectrocatalysis of Rhodamine B (RhB) using H3PO4 and Na2SO4. Electrochemical oxidation studies were performed on BDD anodes with different sp3/sp2 ratios in RhB solution. To achieve these objectives, pH, conductivity, UV-visible, TOC, HPLC and GC-MS analyses were carried out. Based on the results with Rhodamine B, mineralization occurred in all cases, independent of electrolyte and current density, although these parameters affect the rate and efficiency of mineralization. Light irradiation was favorable during the electrolysis of RhB with phosphate and sulfate. Regarding oxidation on BDD anodes with different sp3/sp2 ratios (165, 176, 206, 220, 262 and 329), the anodes with lower sp3 carbon content favored the electrochemical conversion of RhB rather than its combustion, whereas the higher the sp3 carbon content of the BDD anodes, the more direct electrochemical oxidation was favored.
Abstract:
Digital image segmentation is the process of assigning distinct labels to different objects in a digital image, and the fuzzy segmentation algorithm has been used successfully to segment images from several modalities. However, the traditional fuzzy segmentation algorithm fails to segment objects characterized by textures whose patterns cannot be successfully described by simple statistics computed over a very restricted area. In this work we present an extension of the fuzzy segmentation algorithm that achieves the segmentation of textures by employing adaptive affinity functions, and we also extend the algorithm to three-dimensional images. The adaptive affinity functions change the size of the area over which they compute the texture descriptors according to the characteristics of the texture being processed, while the three-dimensional images are treated as a finite set of two-dimensional images. The algorithm then segments the volume image with an appropriate calculation area for each texture, making it possible to produce good estimates of the actual volumes of the target structures of the segmentation process. We perform experiments with synthetic and real data in applications such as the segmentation of medical images obtained from magnetic resonance.
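As a toy illustration of the idea of an adaptive affinity (a minimal sketch with an assumed formulation, not the dissertation's exact affinity functions), the snippet below describes each pixel by the mean and standard deviation over a window whose radius can be chosen per texture, and lets the affinity between neighbouring pixels decay with the distance between their descriptors:

```python
# Minimal adaptive-affinity sketch (assumed descriptor: windowed mean and std).
import numpy as np

def local_stats(image, radius):
    """Mean and std over a (2*radius+1)^2 window around every pixel."""
    padded = np.pad(image.astype(float), radius, mode="reflect")
    win = 2 * radius + 1
    h, w = image.shape
    stack = np.stack([padded[i:i+h, j:j+w] for i in range(win) for j in range(win)])
    return stack.mean(axis=0), stack.std(axis=0)

def adaptive_affinity(image, radius, scale=0.1):
    """Affinity of each pixel with its right-hand neighbour, in [0, 1]."""
    mean, std = local_stats(image, radius)
    d = np.abs(np.diff(mean, axis=1)) + np.abs(np.diff(std, axis=1))
    return np.exp(-d / scale)

img = np.random.default_rng(2).random((64, 64))
print(adaptive_affinity(img, radius=3).shape)   # (64, 63)
```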
Abstract:
Heavy metals are present in industrial waste. These metals can generate a large environmental impact, contaminating water, soil and plants. The chemical action of heavy metals has attracted environmental interest. In this context, this study aimed to test the performance of electrochemical technologies for removing and quantifying heavy metals. First, the electroanalytical technique of stripping voltammetry with a glassy carbon (GC) electrode was standardized in order to use this method for the quantification of metals during their removal by the electrocoagulation (EC) process. Analytical curves were evaluated to ensure the reliability of the determination and quantification of Cd2+ and Pb2+, separately or in a mixture. Meanwhile, the EC process was developed using an electrochemical cell in continuous flow (EFC) for removing Pb2+ and Cd2+. These experiments were performed using parallel Al plates with 10 cm of diameter (63.5 cm2). The optimization of the conditions for removing Pb2+ and Cd2+, dissolved in 2 L of solution at 151 L h-1, was studied by applying different current values for 30 min. Cd2+ and Pb2+ concentrations were monitored during electrolysis using stripping voltammetry. The results showed that the removal of Pb2+ was effective when the EC process was used, reaching removals of 98% in 30 min. This behavior depends on the applied current, which implies an increase in power consumption. The results also verified that the stripping voltammetry technique is quite reliable for determining the Pb2+ concentration when compared with measurements obtained by atomic absorption (AA). In view of this, the second objective of this study was to evaluate the removal of Cd2+ and Pb2+ (in a mixed solution) by EC. The increase in removal efficiency with current was confirmed, with 93% of Cd2+ and 100% of Pb2+ removed after 30 min. The increase in current promotes the oxidation of the sacrificial electrodes and, consequently, a larger amount of coagulant, which influences the removal of heavy metals in solution. Adsorptive stripping voltammetry is a fast, reliable, economical and simple way to determine Cd2+ and Pb2+ during their removal; it is more economical than the methods normally used, which require toxic and expensive reagents. Our results demonstrated the potential use of electroanalytical techniques to monitor the course of environmental interventions. Thus, the application of the two associated techniques can be a reliable way to monitor environmental impacts due to the pollution of aquatic ecosystems by heavy metals.
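A minimal sketch of the analytical-curve step described above, using hypothetical calibration data (the concentrations, peak currents and readings below are illustrative, not the thesis data): a linear fit of stripping peak current versus metal concentration is inverted to quantify Pb2+ during electrocoagulation and to compute the removal efficiency.

```python
# Minimal calibration-curve sketch (hypothetical standards and peak currents).
import numpy as np

conc_std = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # standards (mg/L), assumed
current_std = np.array([0.9, 1.8, 3.7, 7.6, 15.1])   # peak currents (uA), assumed

slope, intercept = np.polyfit(conc_std, current_std, 1)

def concentration(peak_current):
    """Invert the calibration line to estimate concentration (mg/L)."""
    return (peak_current - intercept) / slope

c0 = concentration(12.0)    # hypothetical reading before electrocoagulation
c30 = concentration(0.25)   # hypothetical reading after 30 min of treatment
removal = 100.0 * (c0 - c30) / c0
print(f"removal efficiency: {removal:.1f}%")
```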
Abstract:
In this work, synthetic and theoretical studies were developed for clerodane-type diterpenes obtained from Croton cajucara Benth, one of the most important medicinal plants of the Brazilian Amazon region. Specifically, the major 19-nor-clerodane biocompound trans-dehydrocrotonin (t-DCTN), isolated from the bark of this Croton, was used as the target molecule. Semi-synthetic derivatives were obtained from t-DCTN using the following synthetic procedures: 1) catalytic reduction with H2, 2) reduction with NaBH4 and 3) reduction with NaBH4/CeCl3. The semi-synthetic 19-nor-furan-clerodane alcohol-type derivatives were denominated t-CTN, tCTN-OL, t-CTN-OL, t-DCTN-OL and t-DCTN-OL, all of them characterized by NMR. The furan-clerodane alcohol derivatives t-CTN-OL and tCTN-OL were obtained from the semi-synthetic t-CTN, which can also be isolated from the bark of C. cajucara. A theoretical protocol (DFT/B3LYP) involving the prediction of geometric and magnetic properties, such as bond lengths and angles as well as chemical shifts and coupling constants, was developed for the target t-DCTN, in which theoretical NMR data were correlated with structural data, showing satisfactory agreement with experimental NMR data (correlation coefficients ranging from 0.97 to 0.99) and X-ray diffraction data. This theoretical methodology was also validated for all semi-synthetic derivatives described in this work. In addition, topological data from the Quantum Theory of Atoms in Molecules (QTAIM) showed the presence of stabilizing H-H and (C)O--H(C) intramolecular interactions for t-DCTN and t-CTN, contributing to the understanding of the different reactivity of these clerodanes in the presence of NaBH4.
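The validation step quoted above (correlation coefficients between 0.97 and 0.99) amounts to a linear correlation between computed and experimental chemical shifts. A minimal sketch with hypothetical shift values (the numbers below are assumed, not the thesis data):

```python
# Minimal sketch of correlating DFT/B3LYP chemical shifts with experiment.
import numpy as np

delta_calc = np.array([170.2, 142.5, 108.9, 75.3, 40.1, 21.7])  # computed (ppm), assumed
delta_exp  = np.array([168.9, 141.8, 110.2, 74.6, 39.5, 22.3])  # experimental (ppm), assumed

r = np.corrcoef(delta_calc, delta_exp)[0, 1]          # linear correlation coefficient
slope, intercept = np.polyfit(delta_exp, delta_calc, 1)
print(f"r = {r:.3f}; delta_calc = {slope:.2f}*delta_exp + {intercept:.2f}")
```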
Abstract:
The reverse time migration (RTM) algorithm has been widely used in the seismic industry to generate images of the subsurface and thus reduce the risk of oil and gas exploration. Its widespread use is due to the high quality of its subsurface imaging. RTM is also known for its high computational cost, and therefore parallel computing techniques have been used in its implementations. In general, parallel approaches to RTM use coarse granularity, distributing the processing of subsets of seismic shots among the nodes of distributed systems. Coarse-grained parallel approaches to RTM have been shown to be very efficient, since the processing of each seismic shot can be performed independently. For this reason, RTM performance can be further improved by using a parallel approach with finer granularity for the processing assigned to each node. This work presents an efficient parallel algorithm for 3D reverse time migration with fine granularity using OpenMP. The propagation of the 3D acoustic wave makes up much of the RTM cost, and different load-balancing strategies were analyzed in order to minimize possible parallel performance losses at this stage. The results served as a basis for the implementation of the other RTM phases: backpropagation and the imaging condition. The proposed algorithm was tested with synthetic data representing some possible subsurface structures. Metrics such as speedup and efficiency were used to analyze its parallel performance. The migrated sections show that the algorithm performed satisfactorily in identifying subsurface structures. As for the parallel performance, the analysis clearly demonstrates the scalability of the algorithm, which achieved a speedup of 22.46 for the wave propagation and 16.95 for the full RTM, both with 24 threads.
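The kernel that dominates this cost is the explicit finite-difference update of the acoustic wave equation. The sketch below is a minimal 2D NumPy version (assumed second-order scheme, 2D instead of 3D, vectorization instead of OpenMP), shown only to make the structure of the propagation step explicit:

```python
# Minimal acoustic wave propagation sketch (2D, second-order in time and space).
import numpy as np

def step(p_prev, p_curr, vel, dt, dx):
    """One time step of the 2D acoustic wave equation with a 5-point Laplacian."""
    lap = np.zeros_like(p_curr)
    lap[1:-1, 1:-1] = (
        p_curr[2:, 1:-1] + p_curr[:-2, 1:-1] +
        p_curr[1:-1, 2:] + p_curr[1:-1, :-2] - 4.0 * p_curr[1:-1, 1:-1]
    ) / dx**2
    return 2.0 * p_curr - p_prev + (vel * dt) ** 2 * lap

n = 200
vel = np.full((n, n), 2000.0)                 # constant velocity model (m/s), assumed
p_prev = np.zeros((n, n)); p_curr = np.zeros((n, n))
p_curr[n // 2, n // 2] = 1.0                  # impulsive source at the centre
for _ in range(300):                          # dt*v/dx = 0.2 keeps the scheme stable
    p_prev, p_curr = p_curr, step(p_prev, p_curr, vel, dt=0.001, dx=10.0)
print(np.abs(p_curr).max())
```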
Abstract:
In this work, the treatment of wastewater from the textile industry, containing the dyes Yellow Novacron (YN), Red Remazol BR (RRB) and Blue Novacron CD (NB), as well as the treatment of wastewater from the petrochemical industry (produced water), was investigated by anodic oxidation (OA) with platinum anodes supported on titanium (Ti/Pt) and boron-doped diamond (DDB) anodes. One of the main parameters of this kind of treatment is the type of electrocatalytic material used, since the mechanisms and products of some anodic reactions depend on it. The OA of synthetic effluents containing RRB, NB and YN was investigated in order to find the best conditions for the removal of color and of the organic content of the dye. According to the experimental results, OA is suitable for the decolorization of wastewaters containing these textile dyes due to the electrocatalytic properties of the DDB and Pt anodes. Removal of the organic load was more efficient at DDB in all cases, and the dyes were degraded to aliphatic carboxylic acids by the end of the electrolysis. The energy requirements for color removal during OA of RRB, NB and YN solutions depend mainly on the operating conditions; for example, for RRB they go from 3.30 kWh m-3 at 20 mA cm-2 to 4.28 kWh m-3 at 60 mA cm-2 (pH 1); from 15.23 kWh m-3 at 20 mA cm-2 to 24.75 kWh m-3 at 60 mA cm-2 (pH 4.5); and from 10.80 kWh m-3 at 20 mA cm-2 to 31.5 kWh m-3 at 60 mA cm-2 (pH 8) (values estimated per volume of treated effluent). On the other hand, in the study of the OA of the produced water generated by the petrochemical industry, galvanostatic electrolysis using DDB led to nearly complete removal of COD (98%), due to the large amounts of hydroxyl radicals and peroxodisulfates generated from the oxidation of water and of the sulfates in solution, respectively. The rate of COD removal increases with increasing applied current density (15-60 mA cm-2). Moreover, at the Pt electrode, approximately 50% removal of the organic load was achieved by applying 15 to 30 mA cm-2, while 80% COD removal was achieved at 60 mA cm-2. Thus, the results obtained in the application of this technology were satisfactory, depending on the electrocatalytic materials and operating conditions used, for the removal of the organic load (petrochemical and textile effluents) as well as for the removal of color (in the case of textile effluents). Therefore, electrochemical treatment can be considered a new alternative for the pretreatment or treatment of effluents from the textile and petrochemical industries.
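The specific energy values quoted above (kWh per cubic metre of treated effluent) follow from the usual galvanostatic relation E = U·I·t / V, with U in volts, I in amperes, t in hours and V in litres (1 Wh/L = 1 kWh/m3). A minimal sketch with assumed operating values (cell voltage, electrode area and volume are hypothetical, not taken from the thesis):

```python
# Minimal specific-energy-consumption sketch (assumed cell voltage, area, volume).
def energy_kwh_per_m3(cell_voltage_v, current_a, time_h, volume_l):
    """E = U*I*t/V; with these units the result is directly in kWh per m3."""
    return cell_voltage_v * current_a * time_h / volume_l

# Hypothetical run: 5 V cell voltage, 20 mA/cm2 over a 33 cm2 anode, 4 h, 1 L.
print(energy_kwh_per_m3(cell_voltage_v=5.0, current_a=0.02 * 33, time_h=4.0, volume_l=1.0))
```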
Abstract:
Google Docs (GD) is an online word processor with which multiple authors can work on the same document, synchronously or asynchronously, which can help develop the ability to write in English (WEISSHEIMER; SOARES, 2012). As they write collaboratively, learners find more opportunities to notice the gaps in their written production, since they are exposed to more input from their fellow co-authors (WEISSHEIMER; BERGSLEITHNER; LEANDRO, 2012) and prioritize the process of text (re)construction over concern with the final product, i.e., the final version of the text (LEANDRO; WEISSHEIMER; COOPER, 2013). Moreover, when it comes to second language (L2) learning, producing language enables the consolidation of existing knowledge as well as the internalization of new knowledge (SWAIN, 1985; 1993). Taking this into consideration, this mixed-methods (DÖRNYEI, 2007) quasi-experimental (NUNAN, 1999) study investigates the impact of collaborative writing through GD on the development of the writing skill in English and on the noticing of syntactic structures (SCHMIDT, 1990). Thirty-four university students of English made up the study cohort: twenty-five were assigned to the experimental group and nine to the control group. All learners took a pre-test and a post-test so that we could measure their noticing of syntactic structures. Learners in the experimental group were exposed to a blended learning experience in which they took reading and writing classes at the university and, over eleven weeks, collaboratively wrote three pieces of flash fiction (a complete story told in a hundred words) outside the classroom, online through GD. Learners in the control group took reading and writing classes at the university but did not practice collaborative writing. The first and last stories produced by the learners in the experimental group were analysed in terms of grammatical accuracy, operationalized as the number of grammar errors per hundred words (SOUSA, 2014), and lexical density, which refers to the relationship between the number of words with lexical properties and the number of words with grammatical properties (WEISSHEIMER, 2007; MEHNERT, 1998). Additionally, learners in the experimental group answered an online questionnaire on the blended learning experience they had been exposed to. The quantitative results showed that the collaborative task led to the production of more lexically dense texts over the 11 weeks. The noticing and grammatical accuracy results were different from what we expected; however, they provide insights on measurement issues, in the case of noticing, and on the participants' positive attitude towards collaborative writing with flash fiction. The qualitative results also shed light on the usefulness of computer-mediated collaborative writing in L2 learning.
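The two text measures named above can be illustrated with a minimal sketch (the word lists and counts below are toy assumptions, not the study's coding scheme): lexical density as the proportion of lexical words over total words, and grammatical accuracy as errors per hundred words.

```python
# Minimal sketch of the two measures (toy grammatical-word list, assumed counts).
GRAMMATICAL_WORDS = {
    "the", "a", "an", "of", "in", "on", "to", "and", "but", "is", "are",
    "was", "were", "he", "she", "it", "they", "we", "you", "i", "that", "this",
}

def lexical_density(text):
    """Share of words carrying lexical (content) rather than grammatical meaning."""
    words = text.lower().split()
    lexical = [w for w in words if w not in GRAMMATICAL_WORDS]
    return len(lexical) / len(words)

def errors_per_hundred_words(n_errors, n_words):
    return 100.0 * n_errors / n_words

story = "she opened the jar and the whole town smelled of rain"
print(f"lexical density: {lexical_density(story):.2f}")
print(f"accuracy: {errors_per_hundred_words(3, 100):.1f} errors/100 words")
```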
Abstract:
In line with the model of grammar competition (Kroch, 1989; 2001), according to which change in syntactic domains is a process that develops via competition between different grammars, we describe and analyze the surface V2/V3 constructions in matrix/root sentences of Brazilian personal letters from the 19th and 20th centuries. The corpus, composed of 154 personal letters from Rio de Janeiro and Rio Grande do Norte, is divided into three half-centuries: (i) the second half of the 19th century; (ii) the first half of the 20th century; and (iii) the second half of the 20th century. Our focus was the nature of the preverbal constituents in surface V2 (verb in second position in the sentence) and V3 (verb in third position in the sentence) constructions, with special attention to the position of the subject. Based on the various diachronic studies of Portuguese word-order patterns (Ambar (1992); Ribeiro (1995, 2001); Paixão de Sousa (2004); Paiva (2011); Coelho and Martins (2009, 2012)), our study sought to identify the empirical ordering patterns that involve surface V2/V3 constructions and how these patterns are structured syntactically within a formal theoretical perspective (Chomsky, 1981; 1986), more specifically in accordance with the studies of Antonelli (2011) and Costa & Galves (2002). The survey results show that the data from the second half of the 19th century, unlike the data from the first and second halves of the 20th century, display a greater balance with respect to the syntactic nature of the preverbal constituent (contiguous or not): in this period, orders with the subject in preverbal position reach at most 52% (231/444 tokens), while in the remaining 48% (213/444 tokens) the preverbal constituent is a non-subject, almost always an adverbial adjunct. Given these results, we argue that the Brazilian personal letters of the 19th century display ordering patterns associated with both a V2 system and an SV system, configuring a possible competition between different grammars that instantiate either a V2 system or an SV system. In other words, the Brazilian letters of the 19th century instantiate a competition between the grammar of Classical Portuguese (a V2 system) and the grammars of Brazilian Portuguese and European Portuguese (SV systems). That period therefore allows two distinct parametric settings: (i) verb movement to the Fin head (Classical Portuguese grammar) and (ii) verb movement to the T head (Brazilian Portuguese/European Portuguese grammar). On the other hand, in the personal letters of the 20th century (first and second halves), there is a clear increase in the ordering patterns associated with the SV system, which appears more stable.
Abstract:
The textile industry has been a cause of environmental pollution, mainly due to the generation of large volumes of waste with a high organic load and intense color. In this context, this study evaluated the electrochemical degradation of synthetic textile effluents containing the dye Methylene Blue (AM), using Ti/IrO2-Ta2O5 and Ti/Pt anodes, by direct and indirect (active chlorine) electrooxidation. We evaluated the influence of the applied current density (20, 40 and 60 mA/cm2), the presence of different concentrations of electrolyte (NaCl and Na2SO4), and neutral and alkaline pH media. The electrochemical treatment was conducted in a continuous flow reactor, in which the electrolysis time for 100 ppm of AM was 6 hours. The performance of the electrochemical process was evaluated by UV-vis spectrophotometry, chemical oxygen demand (COD) and total organic carbon (TOC). The results showed that, with increasing current density, it was possible to obtain 100% color removal at both the Ti/IrO2-Ta2O5 and Ti/Pt electrodes. Regarding color removal efficiency, increasing the electrolyte concentration promotes a higher percentage of removal, using 0.02 M Na2SO4 and 0.017 M NaCl. Concerning the aqueous medium, the best color removal results were obtained in alkaline medium using Ti/Pt. In terms of organic matter, 86% removal was achieved at neutral pH with Ti/Pt, against 30% in alkaline medium. To understand the electrochemical behavior related to the oxygen evolution reaction, polarization curves were recorded, showing that the presence of NaCl in solution favored the production of active chlorine species. The best results in energy consumption and cost were obtained by applying the lowest current density (20 mA/cm2) over 6 hours of electrolysis.
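Color removal followed by UV-vis spectrophotometry reduces to comparing the dye's absorbance at its maximum wavelength before and during treatment. A minimal sketch with illustrative readings (the absorbance values below are assumed, not the study's data):

```python
# Minimal color-removal sketch (hypothetical absorbance readings at ~664 nm).
def color_removal_percent(abs_initial, abs_final):
    """Percentage decolorization from initial and current absorbance."""
    return 100.0 * (abs_initial - abs_final) / abs_initial

readings = [1.82, 1.10, 0.55, 0.21, 0.06, 0.01, 0.00]   # one reading per hour, assumed
for hour, a in enumerate(readings):
    print(f"t = {hour} h: removal = {color_removal_percent(readings[0], a):.1f}%")
```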
Abstract:
The synthesis of heterocyclic compounds such as quinoxaline derivatives has been shown to be relevant and promising due to their expressive applications in biological and technological areas. This work was dedicated to the synthesis, characterization and reactivity of quinoxaline derivatives in order to obtain new chemosensors. (L)-Ascorbic acid (1) and 2,3-dichloro-6,7-dinitroquinoxaline (2) were explored as synthetic precursors. Starting from the synthesis of 1 and the characterization of compounds derived from (L)-ascorbic acid, studies were performed on the application of the products as chemosensors, in which compound 36 demonstrated selective affinity for Cu2+ ions in methanolic solution, by naked-eye (colorimetric) and UV-visible analyses. Furthermore, initial analysis suggests that 39, a Schiff base derived from 36, also presents this feature. Five quinoxaline derivatives were synthesized from building block 2 through nucleophilic aromatic substitution by aliphatic amines, in which control of the experimental conditions allows both mono- and di-substituted derivatives to be obtained. Reactivity studies were carried out with two purposes: i) to investigate the possibility of compound 47 being a chemosensor for anions, based on its interaction with sodium hydroxide in DMSO, using image analysis and UV-visible spectroscopy; ii) to characterize kinetically the conversion of compound 44 into 46 based on RGB and multivariate image analysis of TLC data, as a simple and inexpensive qualitative and quantitative tool.
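The RGB-based kinetic characterization can be illustrated with a minimal sketch (the channel intensities below are synthetic assumptions, not the thesis TLC images): the mean intensity of one channel over the reactant's TLC spot is taken as a proxy for its amount, and a first-order decay is fitted to follow the conversion of 44 into 46.

```python
# Minimal RGB-kinetics sketch (hypothetical green-channel intensities over time).
import numpy as np

t = np.array([0.0, 10.0, 20.0, 40.0, 60.0, 90.0])          # time (min), assumed
intensity = np.array([200.0, 148.0, 110.0, 61.0, 33.0, 14.0])  # spot intensity, assumed

# Linear fit of ln(intensity) against time gives the apparent first-order rate constant.
k = -np.polyfit(t, np.log(intensity), 1)[0]
print(f"apparent first-order rate constant: {k:.4f} min^-1")
```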
Abstract:
In oil exploration, seismic data are usually irregularly and sparsely sampled along the spatial coordinates due to obstacles in the placement of geophones. Fourier methods provide a way to regularize seismic data and are efficient if the input data are sampled on a regular grid. However, when these methods are applied to a set of irregularly sampled data, the orthogonality among the Fourier components is broken and the energy of one Fourier component may "leak" into others, a phenomenon called "spectral leakage". The objective of this research is to study the spectral representation of irregularly sampled data. In particular, we present the basic structure of the NDFT (nonuniform discrete Fourier transform), study its properties and demonstrate its potential in seismic signal processing. To this end, we also study the FFT (fast Fourier transform) and the NFFT (nonuniform fast Fourier transform), which rapidly compute the DFT (discrete Fourier transform) and the NDFT, and we compare the recovery of the signal using the FFT, DFT and NFFT. We approach the interpolation of seismic traces using the ALFT (antileakage Fourier transform) to overcome the problem of spectral leakage caused by uneven sampling. Applications to synthetic and real data showed that the ALFT method works well on seismic data from complex geology and suffers little from irregular spatial sampling and edge effects; in addition, it is robust and stable with noisy data. However, it is not as efficient as the FFT, and its reconstruction is not as good in the case of irregular sampling with large gaps in the acquisition.
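To make the NDFT and the leakage phenomenon concrete, the sketch below (a direct-summation NDFT on an assumed toy signal, not the fast NFFT or the thesis code) samples a single sinusoid on an irregular grid and shows that its energy no longer concentrates in one pair of Fourier components:

```python
# Minimal direct-NDFT sketch illustrating spectral leakage on irregular samples.
import numpy as np

def ndft(x_positions, values, n_freq):
    """Direct nonuniform DFT; positions are assumed normalised to [0, 1)."""
    k = np.arange(-(n_freq // 2), n_freq // 2)
    return np.exp(-2j * np.pi * np.outer(k, x_positions)) @ values

rng = np.random.default_rng(3)
x = np.sort(rng.random(64))              # irregular sample positions in [0, 1)
signal = np.sin(2 * np.pi * 5 * x)       # single 5-cycle sinusoid

spectrum = np.abs(ndft(x, signal, 64)) / len(x)
peak = np.argsort(spectrum)[-2:]         # the two symmetric peaks near k = +-5
print("dominant components (k):", peak - 32)
print("energy leaked elsewhere:", round(float(spectrum.sum() - spectrum[peak].sum()), 3))
```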