201 results for Data compression (Telecommunications)
Abstract:
In systems that combine the outputs of classification methods (combination systems), such as ensembles and multi-agent systems, one of the main constraints is that the base components (classifiers or agents) should be diverse among themselves. In other words, there is clearly no accuracy gain in a system composed of a set of identical base components. One way of increasing diversity is through the use of feature selection or data distribution methods in combination systems. In this work, an investigation of the impact of using data distribution methods among the components of combination systems is performed. In this investigation, different methods of data distribution are used and the combination systems are analyzed under several different configurations. The aim of this analysis is to detect which combination systems are best suited to feature distribution among their components.
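One simple form of feature distribution can be sketched as follows: each base component is trained on its own random subset of the features, so that no two components see exactly the same view of the data. This is a generic illustration in Python, not one of the specific methods investigated in the work; all names are hypothetical.

```python
import random

def distribute_features(n_features, n_components, subset_size, seed=0):
    """Assign each ensemble component its own random feature subset."""
    rng = random.Random(seed)
    return [sorted(rng.sample(range(n_features), subset_size))
            for _ in range(n_components)]

def project(sample, features):
    """Restrict one sample to the features seen by one component."""
    return [sample[i] for i in features]

# Three components, each seeing 4 of 10 features:
subsets = distribute_features(n_features=10, n_components=3, subset_size=4)
sample = list(range(10))
views = [project(sample, s) for s in subsets]
# Each component now trains and votes using only its own 4-feature view.
```

Because the subsets differ, the components tend to make different errors, which is precisely the diversity the combination system needs.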
Abstract:
In scientific computing, data must be as precise and accurate as possible; however, the imprecision of the input data of this kind of computation may stem from measurements obtained by equipment that delivers truncated or rounded values, so that calculations over these data produce imprecise results. The most common errors in scientific computing are: truncation errors, which arise when infinite data are truncated, or cut off; and rounding errors, which are responsible for the imprecision of calculations over finite sequences of arithmetic operations. Faced with this kind of problem, Moore introduced interval mathematics in the 1960s, defining a data type that makes it possible to work with continuous data and, moreover, to bound the maximum size of the error. Interval mathematics is a way out of this issue, since it allows automatic error control and analysis. However, the algebraic properties of intervals are not the same as those of the real numbers, even though real numbers can be seen as degenerate intervals, and the algebraic properties of degenerate intervals are exactly those of the reals. Starting from this, and considering algebraic specification techniques, a language is needed that can implement an auxiliary notion of equivalence, introduced by Santiago [6], which "simulates" the algebraic properties of the real numbers on intervals. The Common Algebraic Specification Language (CASL) [1] is an algebraic specification language for the description of functional requirements and modular software designs, which has been developed by CoFI, The Common Framework Initiative [2], since 1996. The development of CASL is ongoing and represents a joint effort by leading researchers in algebraic specification to create a standard for the field.
The proposed dissertation presents a CASL specification of the interval type, equipped with Moore arithmetic, so that it can extend systems that manipulate continuous data, making possible not only the control and analysis of approximation errors, but also the algebraic verification of properties of the kind of system mentioned here. The interval specification presented here was built on the specification of the rational numbers proposed by Mossakowski in 2001 [3] and introduces the notion of local equality proposed by Santiago [6, 5, 4].
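As a minimal executable sketch (in Python, not the CASL specification itself), Moore's interval arithmetic and the loss of real-number algebraic properties can be illustrated as follows; the class and method names are illustrative only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A closed interval [lo, hi] in Moore's interval arithmetic."""
    lo: float
    hi: float

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # [a, b] - [c, d] = [a - d, b - c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # [a, b] * [c, d] = [min of endpoint products, max of endpoint products]
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

    def width(self):
        """Upper bound on the approximation error captured by the interval."""
        return self.hi - self.lo

# A real number x corresponds to the degenerate interval [x, x].
x = Interval(2.0, 2.0)
# A measurement known only to within +/- 0.1:
m = Interval(0.9, 1.1)
print((x * m).lo, (x * m).hi)   # 1.8 2.2
```

Note that `m - m` yields `Interval(-0.2, 0.2)` rather than the degenerate `[0, 0]`: intervals lack additive inverses, which is exactly the algebraic mismatch with the reals that motivates the auxiliary notion of equivalence mentioned above.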
Abstract:
The increase in application complexity has demanded hardware that is ever more flexible and able to achieve higher performance. Traditional hardware solutions have not been successful in meeting these applications' constraints. General-purpose processors have inherent flexibility, since they perform several tasks; however, they cannot reach high performance when compared to application-specific devices. Moreover, since application-specific devices perform only a few tasks, they achieve high performance, although with less flexibility. Reconfigurable architectures emerged as an alternative to traditional approaches and have become an area of rising interest over the last decades. The purpose of this new paradigm is to modify the device's behavior according to the application. Thus, it is possible to balance flexibility and performance and to meet the applications' constraints. This work presents the design and implementation of a coarse-grained hybrid reconfigurable architecture for stream-based applications. The architecture, named RoSA, consists of reconfigurable logic attached to a processor. Its goal is to exploit the instruction-level parallelism of intensive data-flow applications to accelerate their execution on the reconfigurable logic. The instruction-level parallelism extraction is done at compile time, so this work also presents an optimization phase for the RoSA architecture to be included in the GCC compiler. To design the architecture, this work also presents a methodology based on hardware reuse of datapaths, named RoSE. RoSE aims to view the reconfigurable units through reusability levels, which provides area savings and datapath simplification. The architecture presented was implemented in a hardware description language (VHDL) and validated through simulation and prototyping. For performance analysis, some benchmarks were used, demonstrating a speedup of 11x on the execution of some applications.
Abstract:
Motion estimation is primarily responsible for data reduction in digital video encoding; it is also the most computationally demanding step. H.264 is the newest standard for video compression and was planned to double the compression ratio achieved by previous standards. It was developed by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG) as the product of a partnership effort known as the Joint Video Team (JVT). H.264 presents novelties that improve motion estimation efficiency, such as the adoption of variable block sizes, quarter-pixel precision and multiple reference frames. This work defines a hardware/software architecture for motion estimation using a full search algorithm, variable block sizes and mode decision. This work considers the use of reconfigurable devices, soft processors and development tools for embedded systems such as Quartus II, SOPC Builder, Nios II and ModelSim.
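The full search algorithm mentioned above exhaustively evaluates every candidate displacement within a search window around the current block. A minimal software sketch in Python, assuming the commonly used sum-of-absolute-differences (SAD) matching cost (the thesis itself targets a hardware/software implementation; function names here are illustrative):

```python
def sad(block_a, block_b):
    """Sum of absolute differences: a standard block-matching cost."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def full_search(cur, ref, bx, by, n, search_range):
    """Test every displacement (dx, dy) in the window; keep the cheapest."""
    cur_blk = [row[bx:bx + n] for row in cur[by:by + n]]
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + n > len(ref[0]) or y + n > len(ref):
                continue  # candidate block falls outside the reference frame
            cost = sad(cur_blk, [row[x:x + n] for row in ref[y:y + n]])
            if cost < best_cost:
                best, best_cost = (dx, dy), cost
    return best, best_cost
```

The exhaustive window scan is what makes full search both optimal for the chosen cost and expensive, which is why dedicated hardware with high data reuse is attractive for it.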
Abstract:
The use of clustering methods for the discovery of cancer subtypes has drawn a great deal of attention in the scientific community. While bioinformaticians have proposed new clustering methods that take advantage of characteristics of the gene expression data, the medical community has a preference for using classic clustering methods. There have been no studies thus far performing a large-scale evaluation of different clustering methods in this context. This work presents the first large-scale analysis of seven different clustering methods and four proximity measures for the analysis of 35 cancer gene expression data sets. Results reveal that the finite mixture of Gaussians, followed closely by k-means, exhibited the best performance in terms of recovering the true structure of the data sets. These methods also exhibited, on average, the smallest difference between the actual number of classes in the data sets and the best number of clusters as indicated by our validation criteria. Furthermore, hierarchical methods, which have been widely used by the medical community, exhibited a poorer recovery performance than that of the other methods evaluated. Moreover, as a stable basis for the assessment and comparison of different clustering methods for cancer gene expression data, this study provides a common group of data sets (benchmark data sets) to be shared among researchers and used for comparisons with new methods.
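As a toy illustration of the evaluation idea, recovery of the true structure can be measured by comparing a clustering against the known classes, for example with the Rand index (the study itself uses 35 real data sets and dedicated validation criteria; this tiny example only shows what "recovery" means):

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Fraction of sample pairs on which two partitions agree
    (together in both, or separated in both)."""
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = sum((labels_a[i] == labels_a[j]) == (labels_b[i] == labels_b[j])
                for i, j in pairs)
    return agree / len(pairs)

true = [0, 0, 0, 1, 1, 1]
found = [1, 1, 1, 0, 0, 0]      # same structure, different label names
print(rand_index(true, found))  # 1.0
```

A perfect score despite the swapped label names shows why pair-counting indices, rather than raw label agreement, are used to assess cluster recovery.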
Abstract:
The use of middleware technology in various types of systems, in order to abstract low-level details related to the distribution of application logic, is increasingly common. Among the many systems that can benefit from these components, we highlight distributed systems, where it is necessary to allow communication between software components located on different physical machines. An important issue related to communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware in order to provide an application with the abstraction of communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is the possibility of self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism, in case the specified quality-of-service restrictions are not being guaranteed. In this respect, monitoring of the communication state is planned (applying techniques such as a feedback control loop), analyzing the related performance metrics. The Model-Driven Development paradigm was used to generate the implementation of a middleware that serves as a proof of concept of the metamodel, together with the configuration and reconfiguration policies related to the dynamic adaptation processes. In this sense, the metamodel associated with the communication configuration process was defined. The MDD application also comprises the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
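The self-adaptation idea described above can be sketched as a minimal feedback control loop: measure a QoS metric, compare it against the specified restriction, and trigger reconfiguration on violation. This Python sketch is purely illustrative of the control-loop pattern; the names, the latency metric and the thresholds are assumptions, not part of the metamodel.

```python
def control_loop(measure_latency, reconfigure, target_ms, max_rounds=10):
    """Monitor a QoS metric; adapt the communication mechanism on violation."""
    for _ in range(max_rounds):
        latency = measure_latency()
        if latency <= target_ms:
            return latency      # QoS restriction satisfied, no adaptation needed
        reconfigure()           # update parameter values or replace the mechanism
    return measure_latency()    # best effort after exhausting adaptation rounds
```

In the metamodel's terms, `measure_latency` stands for the monitoring of the communication state and `reconfigure` for either a parameter update or a replacement of the communication mechanism.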
Abstract:
Nowadays, several electronic devices support digital video; some examples are cell phones, digital cameras, video cameras and digital televisions. However, raw video comprises a huge amount of data, millions of bits, when represented the way it was captured. Storing it in this primary form would require a huge amount of disk space, and transmitting it would require huge bandwidth. Video compression thus becomes essential to make information storage and transmission possible. Motion estimation is a technique used in the video coder that exploits the temporal redundancy present in video sequences to reduce the amount of data necessary to represent the information. This work presents a hardware architecture of a motion estimation module for high-resolution video according to the H.264/AVC standard. H.264/AVC is the most advanced video coding standard, with several new features that allow it to achieve high compression rates. The architecture presented in this work was developed to provide high data reuse; the data reuse scheme adopted reduces the bandwidth required to execute motion estimation. Motion estimation is the task responsible for the largest share of the gains obtained with the H.264/AVC standard, so this module is essential for the final video coder performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian System of Digital Television.
Abstract:
The main goal of this work is to investigate the suitability of applying cluster ensemble techniques (ensembles or committees) to gene expression data. More specifically, we develop experiments with three different cluster ensemble methods that have been used in many works in the literature: the co-association matrix, relabeling and voting, and ensembles based on graph partitioning. The inputs for these methods are the partitions generated by three clustering algorithms representing different paradigms: k-means, Expectation-Maximization (EM), and the hierarchical method with average linkage. These algorithms have been widely applied to gene expression data. In general, the results obtained in our experiments indicate that the cluster ensemble methods present better performance when compared to the individual techniques. This happens mainly for the heterogeneous ensembles, that is, ensembles built from base partitions generated with different clustering algorithms.
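The co-association matrix method can be sketched as follows: each entry records the fraction of base partitions that place a pair of objects in the same cluster, and a final clustering is then derived from this matrix. A minimal Python illustration (the actual experiments use partitions from k-means, EM and average linkage on gene expression data; the data below are made up):

```python
def co_association(partitions):
    """Fraction of base partitions that place each pair of items together."""
    n, m = len(partitions[0]), len(partitions)
    counts = [[0] * n for _ in range(n)]
    for labels in partitions:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    counts[i][j] += 1
    return [[c / m for c in row] for row in counts]

# Three base partitions of five items (e.g. one per clustering algorithm):
parts = [[0, 0, 1, 1, 1],
         [0, 0, 0, 1, 1],
         [1, 1, 0, 0, 0]]
coassoc = co_association(parts)
# coassoc[0][1] == 1.0: items 0 and 1 are co-clustered in every base partition.
```

The matrix can then be treated as a similarity matrix and fed to any clustering algorithm to obtain the consensus partition.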
Abstract:
The increasing complexity of integrated circuits has boosted the development of communication architectures such as Networks-on-Chip (NoCs) as an architectural alternative for the interconnection of Systems-on-Chip (SoCs). Networks-on-Chip provide component reuse, parallelism and scalability, enhancing reusability in projects for dedicated applications. In the literature, many proposals have been made suggesting different configurations for network-on-chip architectures. Among the networks-on-chip considered, the IPNoSys architecture is a non-conventional one, since it allows the execution of operations while the communication process is performed. This study aims to evaluate the execution of data-flow-based applications on IPNoSys, focusing on adapting them to the design constraints. Data-flow-based applications are characterized by a continuous stream of data on which operations are executed. We expect that this type of application can be improved when running on IPNoSys, because its programming model is similar to the execution model of this network. By observing the behavior of these applications running on IPNoSys, changes were made to the execution model of the IPNoSys network, allowing the implementation of instruction-level parallelism. For these purposes, the implementations of the data-flow applications were analyzed and compared.
Abstract:
Symbolic Data Analysis (SDA) mainly aims to provide tools for reducing large databases in order to extract knowledge, and techniques to describe the units of such data as complex units, such as intervals or histograms. The objective of this work is to extend classical clustering methods for symbolic interval data based on interval-based distances. The main advantage of using an interval-based distance for interval-based data lies in the fact that it preserves the underlying imprecision of the intervals, which is usually lost when real-valued distances are applied. This work also includes an approach that allows existing indices to be adapted to the interval context. The proposed methods with interval-based distances are compared with the punctual distances existing in the literature through experiments with simulated and real interval data.
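A common example of an interval-based distance is the Hausdorff distance between closed intervals, which is sensitive to the width (imprecision) of the intervals, whereas a midpoint-based (punctual) distance is not. The sketch below is illustrative and not necessarily the distance adopted in the work:

```python
def hausdorff_interval(a, b):
    """Hausdorff distance between closed intervals a = (a1, a2), b = (b1, b2):
    for intervals on the line it reduces to max(|a1 - b1|, |a2 - b2|)."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def point_distance(a, b):
    """Distance between interval midpoints: the imprecision is lost."""
    return abs((a[0] + a[1]) / 2 - (b[0] + b[1]) / 2)

wide, narrow = (0.0, 10.0), (4.0, 6.0)
print(point_distance(wide, narrow))      # 0.0 -- same midpoint
print(hausdorff_interval(wide, narrow))  # 4.0 -- the widths differ
```

The two intervals share a midpoint but differ greatly in width; only the interval-based distance registers that difference, which is the advantage claimed above.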
Abstract:
Geographic Information Systems (GIS) are computational tools used to capture, store, query, manipulate, analyze and print geo-referenced data. A GIS is a multidisciplinary system that can be used by different communities of users, each one having its own interests and knowledge. Thus, different views of knowledge about the same reality need to be combined in such a way as to serve each community. This work presents a mechanism that allows users from different communities to access the same geographic database without knowing its particular internal structure. We use geographic ontologies to support a common and shared understanding of a specific domain: coral reefs. Using the descriptions in these ontologies, which represent the knowledge of the different communities, mechanisms are created to handle such different concepts. We use equivalent-class mapping and a semantic layer that interacts with the ontologies and the geographic database and gives the user the answers to his or her queries, independently of the terms used.
Abstract:
Northeastern Brazil is mainly formed by crystalline terrains (around 60% in area). Moreover, this region has a semi-arid climate, so it is periodically subject to drought seasons. Furthermore, the ground water extracted from wells is usually of poor quality because of its high salinity content. Nevertheless, ground water is still a very important source of water for human and animal consumption in this region. Well siting in hard-rock terrains in Northeastern Brazil has a mean success index of about 60%, where a successful siting is defined as a well producing at least 0.5 m³/h. This low index reveals a lack of knowledge about the true conditions of storage and percolation of ground water in crystalline rocks. Two models for structures storing and producing ground water in crystalline rocks in Northeastern Brazil have been proposed in the literature. The first model, traditionally used for well siting since the sixties, assumes that drainages are controlled by faults or fracture zones. This model is commonly referred to, in the Brazilian hydrogeological literature, as the "creek-crack" model (riacho-fenda in Portuguese). Sites appearing to present a dense drainage network are preferred for water well siting, particularly at points where the drainages cross-cut each other. Field follow-up work is usually based only on geological criteria. The second model is the "eluvio-alluvial trough" (calha elúvio-aluvionar in Portuguese); it is also described in the literature but has not yet been incorporated into well siting practice. This model is based on the hypothesis that rectilinear drainages can also be controlled by the foliation of the rock. Eventually, depending upon the degree of weathering, a trough-shaped structure filled with sediments (alluvium and regolith) can develop, which can store ground water and from which water can be produced. Using several field case studies, this Thesis presents a thorough analysis of the two models cited above and proposes a new model.
The analysis is based on an integrated methodological approach using geophysics and structural geology. Both land (resistivity and Ground Penetrating Radar, GPR) and aerogeophysical (magnetics and frequency-domain electromagnetics) surveys were used. Structural analysis emphasized neotectonic aspects; in general, it was found that fractures in the E-W direction are relatively open, as compared to fractures in the N-S direction, probably because E-W fractures were opened by the neotectonic stress regime in Northeastern Brazil, which is controlled by E-W compression and N-S extension. The riacho-fenda model is valid where drainages are controlled by fractures. The degree of fracturing and the associated weathering dictate the hydrogeological potential of the structure. Field work in structural analogues reveals that subvertical fractures show consistent directions both at outcrop and aerophotograph scales. Geophysical surveys reveal subvertical conductive anomalies associated with the fracture network controlling the drainage; one of the borders of the conductive anomaly usually coincides with the drainage. An aspect of particular importance for the validation of fracture control is the possible presence of relatively deep conductive anomalies without continuation or propagation to the surface. The conductive nature of the anomaly is due to the presence of weathered rock and sediments (alluvium and/or regolith) storing ground water, which occur associated with the fracture network. Magnetic surveys are not very sensitive to these structures. If the soil or covering sediments are resistive (> 100 Ohm.m), GPR can be used to image the fracture network precisely. A major limitation of the riacho-fenda model, revealed by GPR images, is associated with the fact that subhorizontal fractures play a very important role in connecting the fracture network, besides connecting shallow recharge zones to relatively deep subvertical fracture zones.
If fractures exert only a secondary control on the drainage, however, the riacho-fenda model may have very limited validity; in these cases, large portions of the drainage do not coincide with fractures, and most of the wells located in the drainage surroundings would turn out dry. Usually, a secondary control on the drainage by the fracture network can be revealed only by a detailed geophysical survey. The calha elúvio-aluvionar model is valid where drainages are controlled by foliation. The degree of weathering of the foliation planes dictates the hydrogeological potential of the structure. Outcrop analysis reveals that foliation and drainage directions are parallel and that no fractures, or only fractures with directions different from the drainage direction, occur. Geophysical surveys reveal conductive anomalies in a slab form associated with the trough of weathered rock and sediments (alluvium and/or regolith). Magnetic surveys can offer very good control on the foliation direction. An important aspect for validating foliation control is the presence of conductive anomalies showing shallow and deep portions which are linked. If there is an extensive soil cover, riacho-fenda and calha elúvio-aluvionar controls can easily be misinterpreted in the absence of geophysical control. Certainly, this fact could explain at least part of the failure index in well siting. The weathering sack model (bolsão de intemperismo in Portuguese) is proposed to explain cases where very intense weathering occurs over the crystalline rock, so that a secondary interstitial porosity is created. The water is then stored in the pores of the regolith in a manner similar to sedimentary rocks. A possible example of this model was detected using a land geophysical survey, in which a relatively deep, isolated conductive anomaly in a slab form was found. If this structure does store ground water, there must certainly be a link between the deep structure and the surface in order to provide water feeding.
This model might explain anomalous water yields as great as 50 m³/h that sometimes occur in crystalline rocks in Northeastern Brazil.
Abstract:
This thesis encompasses the integration of geological, geophysical, and seismological data in the eastern part of the Potiguar Basin, northeastern Brazil. The northeastern region is located on the South American passive margin, which exhibits important areas presenting neotectonic activity. Defining the chronology of events, the geometry of the structures generated by these events, and which structures have been reactivated is a necessary task in the region. The aims of this thesis are the following: (1) to identify the geometry and kinematics of neotectonic faults in the eastern part of the Potiguar Basin; (2) to date the tectonic events related to these structures and relate them to paleoseismicity in the region; (3) to present evolutionary models that could explain the evolution of Neogene structures; (4) and to investigate the origin of the reactivation process, mainly the type of structure associated with faulting. The main types of data used comprise structural field data, well and resistivity data, remote sensing imagery, chronology of sediments, morphotectonic analysis, X-ray analysis, and seismological and aeromagnetic data. Paleostress analysis indicates that at least two tectonic stress fields occurred in the study area: N-S-oriented compression and E-W-oriented extension from the late Campanian to the early Miocene, and E-W-oriented compression and N-S-oriented extension from the early Miocene to the Holocene. These stress fields reactivated NE-SW- and NW-SE-trending faults. Both sets of faults exhibit right-lateral strike-slip kinematics, associated with a minor normal component. It was possible to determine the en echelon geometry of the Samambaia fault, which is ~63 km long, 13 km deep, trends NE-SW and dips strongly to NW.
Sediment-filled faults in granitic rocks yielded Optically Stimulated Luminescence (OSL) and Single-Aliquot Regeneration (SAR) ages of 8,000-9,000, 11,000-15,000, 16,000-24,000, 37,000-45,500, 53,609-67,959 and 83,000-84,000 yr BP. The analysis of the ductile fabric in the João Câmara area indicates that the regional foliation is NE-SW-oriented (032°-042°), which coincides with the orientation of the epicenters and Si-rich veins. The collective evidence points to reactivation of preexisting structures. Paleoseismological data suggest paleoseismic activity much higher than that indicated by the short historical and instrumental record.
Abstract:
Hydrogeological prospecting in Northeast Brazil and in other crystalline terrains has been developed on the basis of structural and regional geology concepts that date back to the 1950s and 1960s and, as such, demand a natural re-evaluation and update. In this kind of terrain, the percolation and accumulation of ground water are controlled by fractures and other types of discontinuities, such as foliations and geological contacts, which, through weathering, impart porosity and permeability to the rocks, allowing water flow and storage. Several factors should be considered in the process of locating water wells, as discussed in the literature; among these are the kind of structures, fracture geometry (including aperture and connectivity) and their geological and chronological context. It is important to correlate fracture systems with the regional neotectonic framework. Fractures at a low angle (subparallel) to the principal stress axis (σ1) are those which tend to open (they actually work as tension joints) and, in principle, would present the greatest hydric potential; on the opposite side, fractures at a high angle to σ1 would be kept closed by a compressional component. Fractures diagonal to the compression and tension axes correspond to shear fractures and, due to their connectivity with secondary fractures, are also important in terms of hydric potential. Uplift followed by terrain denudation leads to decompression and a general tendency to open (aided by weathering processes) fractures and other rock discontinuities at different orientations. Low-angle fractures formed in this context are equally important in increasing connectivity, collection of water and recharge of the aquifer systems. In a general way, an opening component (neotectonic or by terrain decompression) and the several mechanisms that increase fracture connectivity correlate with a greater hydric potential of these structures.
Together with parallel research, this thesis addresses models of ground water occurrence in crystalline terrains, improving well-established concepts like the Riacho-Fenda model, but also stressing other possibilities, like the role of alluvium and paleo-regoliths (the Calha Elúvio-Aluvionar model) and of strongly altered, permo-porous zones placed at variable depths below the present surface, flanking several types of discontinuities, especially interconnected fracture arrays (the Bolsões de Intemperismo model). Different methodological approaches are also discussed in order to improve success rates in the location of water wells in crystalline terrains. In this methodological review, a number of case studies were selected in the eastern domain of the State of Rio Grande do Norte, involving the localities of Santa Cruz, Santo Antônio, Serrinha, Nova Cruz, Montanhas, Lagoa de Pedras and Lagoa Salgada. Besides the neotectonic analysis of brittle structures, this Thesis addresses the validation of remote sensing as a tool for ground water prospecting. Several techniques were tested in order to detect and select areas with higher potential for ground water accumulation, using Landsat 5-TM and RADARSAT images, besides conventional aerial photos. A number of filters were tested to emphasize lineaments in the images, improving their discrimination, and to identify areas with higher overburden humidity, which could reflect subsurface water accumulation, as well as alluvium and other sedimentary covers that might act as recharge zones. The work started with a regional analysis of the orbital images, followed by analysis of aerial photos, up to a detailed structural study of rock exposures in the terrain. This last step involved the analysis of outcrops surrounding wells (within a radius of approximately 10 to 100 m) with distinct productivities, including dry examples.
At the required level of detail, it was not possible to accomplish a statistical approach using the available well data catalogs, which lack the desired specific information. The methodology worked out in this Thesis must undergo a testing phase through the location of new water wells. The desired increase in success rates would lead to a further consolidation step, with wider dissemination of the methodology to private companies and governmental agencies involved in ground water prospecting in crystalline terrains.
Abstract:
The study of Brazilian sedimentary basins has concentrated on their rift phase, whereas the post-rift phase has been considered a period of tectonic quiescence. The post-rift sequence of the Potiguar Basin, in far northeastern Brazil, was once considered little deformed; however, several studies have shown that it was affected by major fault systems. The purpose of this thesis is to characterize the post-rift tectonics. The specific objectives are: to characterize the Neogene and Quaternary sedimentary units that crop out in the Potiguar Basin; to show how the NW-SE-trending Afonso Bezerra fault system deformed outcropping rocks in the Basin; and to describe soft-sediment deformation in gravels of the Quaternary alluvial deposits of the Açu River. Facies analyses, grain-size studies, luminescence dating, remote sensing, structural mapping, shallow geophysics (georadar), paleostress analysis and petrography were carried out. The structural mapping and the georadar sections indicated that the Carnaubais and Afonso Bezerra fault systems formed fractures, silicified and non-silicified faults, and deformation bands, affecting mainly the Açu, Jandaíra and Barreiras formations. The petrographic data indicate that the strong silicification gave the faults a sealing character. Paleostress analysis indicates that two stress fields affected the Basin: the first, with N-S-trending compression, lasted from the Late Cretaceous to the Miocene; the second, with E-W-trending compression, has acted from the Miocene to the present. It was verified that the Afonso Bezerra fault system was reactivated in post-Campanian times and affects all post-rift lithostratigraphic units of the Potiguar Basin, including Quaternary sedimentary covers. The study of soft-sediment deformation structures indicates that they are similar in morphology and size to modern examples of seismically induced deformation structures in coarse sediments.
TL and OSL ages indicate that sediment deposition and the associated soft-sediment deformation occurred at least six times from ~352 ka to ~9 ka. Finally, these studies demonstrate how recent the tectonics in the Potiguar Basin is.