1000 results for "Algoritmos de filtragem" (filtering algorithms)


Relevance:

10.00%

Publisher:

Abstract:

The material presents process and thread scheduling policies. Process scheduling (or processor scheduling) deals with deciding which process will run at a given instant and on which processor. The material also presents relevant scheduling algorithms, including examples of preemptive and non-preemptive algorithms, the objectives and criteria of scheduling, and different types of scheduling: FIFO (first-in first-out) scheduling, round-robin (RR) scheduling, SPF (Shortest Process First) scheduling, SRT (Shortest Remaining Time) scheduling, FSS (Fair Share Scheduling), real-time scheduling, Java thread scheduling on the JVM, and scheduling in Windows XP and UNIX.
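For illustration, here is a minimal Python sketch (not from the material) of the round-robin (RR) policy named above, assuming all processes arrive at time zero and a fixed quantum; the process names and tuple layout are invented for the example.

```python
from collections import deque

def round_robin(processes, quantum):
    """Simulate round-robin scheduling.

    processes: list of (name, burst_time) pairs, all arriving at t=0.
    quantum:   fixed time slice given to each process in turn.
    Returns a list of (name, start, end) execution slices.
    """
    ready = deque(processes)
    timeline, clock = [], 0
    while ready:
        name, remaining = ready.popleft()
        slice_len = min(quantum, remaining)
        timeline.append((name, clock, clock + slice_len))
        clock += slice_len
        if remaining > slice_len:          # not finished: back of the queue
            ready.append((name, remaining - slice_len))
    return timeline

# Example: three CPU bursts, quantum of 2 time units.
print(round_robin([("P1", 5), ("P2", 3), ("P3", 1)], quantum=2))
```

Each process runs for at most one quantum before returning to the back of the ready queue, which is what bounds the waiting time of short processes under RR.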

Relevance:

10.00%

Publisher:

Abstract:

This work analyzes non-linear control solutions based on neural networks and presents their application to a practical case, from the training algorithm through to the physical implementation in hardware. An initial survey of the state of the art in neural-network-based control leads to the proposal of iterative solutions for defining the network architecture, to the study of regularization and early-stopping techniques via genetic algorithms, and to a proposed method for validating the resulting models. Four model-based control loops are used throughout the thesis, one of which is an original contribution, and an on-line identification process is implemented, based on the Levenberg-Marquardt training algorithm and the early-stopping technique, which allows a system to be controlled without prior knowledge of its characteristics. The work concludes with a survey of the commercial hardware available for implementing neural networks and with the development of a hardware solution using an FPGA. The practical testing of the proposed solutions is carried out with real data from a small-scale electric furnace.
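As a sketch of the early-stopping idea mentioned above (this is not the thesis's implementation; the `step` and `val_error` callbacks stand in for one Levenberg-Marquardt training epoch and a validation-set evaluation):

```python
def train_with_early_stopping(step, val_error, max_epochs=1000, patience=20):
    """Generic early-stopping loop (a sketch, not the thesis's code).

    step():      runs one training epoch, e.g. one Levenberg-Marquardt update.
    val_error(): returns the current error on a held-out validation set.
    Training stops once the validation error has not improved for
    `patience` consecutive epochs, limiting overfitting in much the same
    way as explicit regularization.
    """
    best_err, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        step()
        err = val_error()
        if err < best_err:
            best_err, best_epoch = err, epoch   # snapshot the best weights here
        elif epoch - best_epoch >= patience:
            break                               # validation error has stagnated
    return best_err, best_epoch
```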

Relevance:

10.00%

Publisher:

Abstract:

This master's dissertation develops a system for replicating digital integrated circuits (combinational and sequential) by observing their normal operation. The system can extract and describe in VHDL the behavior of a digital integrated circuit in operation, using non-invasive, automated techniques supported by a broad set of data acquisition and analysis algorithms. The system rests on two main modules: a software module consisting of a platform of analysis, control and management algorithms (hosted on a computer), and a data acquisition module (hardware) consisting of a circuit capable of performing the measurements required by the system, commanded by the software module. The two modules communicate over a serial port. The algorithms analyze the correspondence between inputs and outputs, first attempting to fit a combinational circuit; when that fails, heuristic methods approximate a sequential circuit by a state machine. Constant inputs or outputs are identified beforehand and excluded from the analysis, for simplification. The results show that the observed behavior of digital circuits (combinational and sequential) can be replicated provided the number of collected samples is adequate. The method also replicates the integrated circuit's functionality under the conditions in which the circuit is embedded.
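The combinational-versus-sequential decision described above can be illustrated by a minimal consistency check (a Python sketch under assumed data shapes, not the dissertation's algorithms): if the same input vector is ever observed with two different outputs, the behavior cannot be purely combinational and a state-machine approximation is needed.

```python
def is_combinational(samples):
    """Check whether observed (inputs, outputs) samples are consistent
    with a combinational circuit: every input vector must always map to
    the same output vector.  `samples` is a list of (inputs, outputs)
    tuples of bit strings.
    """
    table = {}
    for ins, outs in samples:
        if table.setdefault(ins, outs) != outs:
            return False, None     # same input, different output: state involved
    return True, table             # truth table recovered from observation

# Example: XOR-like behavior observed on two inputs.
obs = [("00", "0"), ("01", "1"), ("10", "1"), ("11", "0"), ("01", "1")]
ok, table = is_combinational(obs)
print(ok, table)
```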

Relevance:

10.00%

Publisher:

Abstract:

Bandwidth control is an important concept when dealing with large-scale networks. ISPs need to guarantee availability and quality of service to all customers while ensuring that the network as a whole does not slow down. To achieve this, ISPs must collect traffic data, analyze it, and use it to set each customer's bandwidth. NOS Madeira operated such a system for several years. However, that system had become obsolete, and a new one had to be built entirely from scratch. Its limitations included the impossibility of changing the traffic-analysis algorithms, poor integration with NOS Madeira's network management services, and low scalability and modularity. The IP Network Usage Accounting system is the answer to these problems. This project focuses on the development of the Accounting System subsystem, the second of the three subsystems that make up the IP Network Usage Accounting system. This subsystem, successfully implemented and currently in production at NOS Madeira, is responsible for analyzing the data mentioned above and for using the results of that analysis to steer bandwidth availability according to each customer's network usage.
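A minimal Python sketch of the accounting idea described above; the record layout, quota and tier policy are illustrative assumptions, not the NOS Madeira design.

```python
from collections import defaultdict

def account_usage(flow_records):
    """Aggregate per-client traffic from flow records (client_id, bytes)."""
    totals = defaultdict(int)
    for client_id, nbytes in flow_records:
        totals[client_id] += nbytes
    return totals

def assign_bandwidth(total_bytes, base_mbps=100, quota=50 * 10**9):
    """Illustrative policy: clients above the quota get throttled."""
    return base_mbps if total_bytes <= quota else base_mbps // 2

# Toy example: one heavy user, one light user.
records = [("client-a", 60 * 10**9), ("client-b", 10 * 10**9)]
for client, used in account_usage(records).items():
    print(client, assign_bandwidth(used), "Mbps")
```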

Relevance:

10.00%

Publisher:

Abstract:

Discrete mathematics is one of the oldest branches of mathematics. In recent times it has advanced greatly, especially in graph theory, which has become a powerful analysis tool for understanding and solving various kinds of complex problems. The goal of this work is to contribute to uncovering possible relations between subjects that at first sight might seem disparate (when in reality they are not), such as coloring, planarity and the existence of matchings in graphs. This dissertation is a reflective study of graph theory whose main idea is to question and discuss some pertinent topics, definitions and theorems, always in relation to graph planarity. We develop a line of reasoning and build arguments supporting the existence of a relation between this topic and graph coloring and the existence of matchings in graphs, using examples and establishing cause-and-consequence relations, thereby deducing the corresponding conclusions. Non-planar graphs can sometimes look visually complex, owing to the many crossings between their edges, which discourages their use as a tool for solving problems, whether everyday ones or more complex ones from the broad areas of research. One of the purposes of this work is to demystify this idea and prove that many definitions, properties, theorems and algorithms can be applied to any kind of graph, regardless of its planarity.
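As a small illustration of that final point, greedy vertex coloring is one algorithm that applies to any graph, planar or not (a Python sketch, not from the dissertation):

```python
def greedy_coloring(adj):
    """Greedy vertex coloring; works on any graph regardless of planarity.
    adj: dict mapping vertex -> iterable of neighbours.
    Returns dict vertex -> colour index (at most max_degree + 1 colours).
    """
    colour = {}
    for v in adj:                              # fixed vertex order
        used = {colour[u] for u in adj[v] if u in colour}
        c = 0
        while c in used:                       # smallest colour unused by neighbours
            c += 1
        colour[v] = c
    return colour

# K5 is non-planar, yet the same algorithm applies to it unchanged.
k5 = {i: [j for j in range(5) if j != i] for i in range(5)}
print(greedy_coloring(k5))   # the complete graph K5 needs 5 colours
```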

Relevance:

10.00%

Publisher:

Abstract:

Analyzes the indexing of the documents of the Chemistry Sectional Library through an informetric study of the search tool of the Library System (SISBI) of the Universidade Federal do Rio Grande do Norte (UFRN). Describes an informetric study carried out on the UFRN SISBI search tool, directed at the documents of the Chemistry Sectional Library. Highlights the importance of informetric studies for analyzing information retrieval as it relates to indexing. Addresses the relation of informetrics to indexing and information retrieval, so that librarians become more analytical and gain a broader understanding of the field of information science. Uses a methodology of queries on predefined subjects, filtered quantitatively, in order to verify whether document indexing is adequate for information retrieval in the SISBI search tool. Assesses the relevance of the documents returned by each search, together with its precision and recall, showing that good information retrieval requires indexing that leaves no ambiguity with other terms; this demonstrates the importance of regularly performing informetric studies that verify information retrieval, so that it can be continually improved.
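The precision and recall figures mentioned above are computed per query as in this short Python sketch (the document identifiers are illustrative):

```python
def precision_recall(retrieved, relevant):
    """Precision and recall for one query.
    retrieved: set of document ids returned by the search tool.
    relevant:  set of document ids actually relevant to the subject.
    """
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Example: 10 documents returned, 6 of them among the 12 relevant ones.
retrieved = set(range(10))
relevant = set(range(4, 16))
print(precision_recall(retrieved, relevant))   # (0.6, 0.5)
```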

Relevance:

10.00%

Publisher:

Abstract:

The central interest of this thesis is to understand how public action drives the formation and transformation of tourist destinations. The research was based on the premise that public actions result from a mediation process among state and non-state actors regarded as important in a sector, who interact seeking to make their interests and world views prevail over the others'. The case of Porto de Galinhas beach, in Pernambuco, the locus of this investigation, allowed the analysis of a multiplicity of actors in the formation and implementation of local actions toward tourism development between 1970 and 2010, and made it possible to understand how the referential of the interventions was constructed. This qualitative thesis draws its theoretical support from the cognitive approach to public policy analysis developed in France, whose main exponents are Bruno Jobert and Pierre Muller. This choice was motivated by the emphasis on the cognitive and normative factors of policy, aspects little explored in Brazilian public policy studies. Documental, bibliographic and field research supplied the data for the (re)constitution of the formation and transformation of the site concerned. Content analysis and documental analysis were the techniques applied. To trace the referential of public action, we began by characterizing the boundaries of the tourism sector and the creation of images by its main international body, the World Tourism Organization, whose meeting minutes set out guidelines for member countries, including Brazil; these make up the global-sectorial referential of the sector. The analysis of the evolution of tourism in the country showed that Brazilian public policies underwent organizational transformations over the years, indicating changes in the referential that guided the interventions. These guidelines and transformations were identified in the construction of the tourist destination of Porto de Galinhas, whose data were systematized and presented in four historical periods, discussing the values, the norms, the algorithms, the images and the important mediators. The State is shown to have played different roles in local tourism across the decades analyzed. From the 1990s onward, however, new actors joined the formulation and implementation of the policies developed, especially local hoteliers. Through their association, these hoteliers established a leadership relation in the local tourism sector and were thereby able to assert their hegemony and spread their own interests. The leadership acquired by one group of actors in Porto de Galinhas does not mean that exchanges within the industry were neutralized, but that a cognitive framework confronts the actors involved. Despite the advances achieved by the mediators' work over the last decades, which broadened and diversified activity in the area and consolidated the beach as a tourist destination of national prominence, the destination's competitive position is unstable, given a situation of social and environmental unsustainability.

Relevance:

10.00%

Publisher:

Abstract:

In the present study we developed algorithms, based on concepts from percolation theory, that analyze the connectivity conditions in geological models of petroleum reservoirs. From petrophysical parameters such as permeability, porosity, transmissivity and others, which may be generated by any statistical process, it is possible to determine the portion of the model with the most connected cells, which wells are interconnected, and the critical path between injector and producer wells. This makes it possible to classify the reservoir according to the modeled petrophysical parameters, and to determine the percentage of the reservoir to which each well is connected. In general, the connected regions and the corresponding minima and/or maxima in the occurrence of the petrophysical parameters studied are a good way to characterize a reservoir volumetrically. The algorithms therefore allow the positioning of wells to be optimized, offering a preview of the general connectivity conditions of a given model. The intent is not to evaluate geological models, but to show how to interpret the deposits, how their petrophysical characteristics are spatially distributed, and how the connections between the several parts of the system are resolved, showing their critical paths and backbones. Running these algorithms lets us know the connectivity properties of the model before work on reservoir flow simulation is started.
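A percolation-style connectivity analysis of this kind can be sketched as a flood fill over the cells above a permeability cutoff (a simplified 2-D Python illustration, not the study's algorithms):

```python
from collections import deque

def connected_fraction(permeable, start):
    """Fraction of a 2-D reservoir model connected to a given cell.
    permeable: 2-D list of booleans (True = cell above the cutoff);
    start:     (row, col) of e.g. an injector well.
    """
    rows, cols = len(permeable), len(permeable[0])
    seen, queue = set(), deque()
    if permeable[start[0]][start[1]]:
        seen.add(start); queue.append(start)
    while queue:                                   # breadth-first flood fill
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and permeable[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc)); queue.append((nr, nc))
    total = sum(row.count(True) for row in permeable)
    return len(seen) / total if total else 0.0

grid = [[True, True, False],
        [False, True, False],
        [False, True, True]]
print(connected_fraction(grid, (0, 0)))   # 1.0: all permeable cells connect
```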

Relevance:

10.00%

Publisher:

Abstract:

The main objective of seismic processing is to provide an adequate picture of the geological structures in the subsurface of sedimentary basins. Among the key steps of this process are the enhancement of seismic reflections by filtering out unwanted signals, called seismic noise, the improvement of the signals of interest, and the application of imaging procedures. Seismic noise may be random or coherent. This dissertation presents a technique to attenuate coherent noise, such as ground roll and multiple reflections, based on the Empirical Mode Decomposition (EMD) method. The method decomposes the seismic trace into Intrinsic Mode Functions: locally symmetric components whose local mean is zero and whose numbers of zero crossings and extrema are equal (or differ by at most one). The technique was tested on synthetic and real data, and the results were considered encouraging.
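One sifting step of EMD can be sketched as follows (a simplified Python illustration assuming SciPy; the boundary handling and stopping criteria of the full algorithm are omitted):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(t, x):
    """One sifting step of Empirical Mode Decomposition: subtract the
    mean of the upper and lower envelopes, built by cubic splines through
    the local maxima and minima.  Repeating this until the IMF criteria
    hold extracts one Intrinsic Mode Function.
    """
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 2 or len(minima) < 2:
        return x                       # too few extrema: treat as residue
    upper = CubicSpline(t[maxima], x[maxima])(t)
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - (upper + lower) / 2.0   # remove the local mean

# Toy trace: a slow and a fast oscillation mixed together.
t = np.linspace(0, 1, 500)
trace = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
h = sift_once(t, trace)                # first step toward the fastest IMF
```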

Relevance:

10.00%

Publisher:

Abstract:

From their early days, Electrical Submergible Pumping (ESP) units have excelled at lifting much greater liquid rates than most other types of artificial lift, performing well even in wells with high BSW, in both onshore and offshore environments. For any artificial lift system, the lifetime and the frequency of interventions are of paramount importance, given the high costs of rigs and equipment plus the losses caused by a halt in production. Extending the life of the system requires operating it efficiently and safely within the limits of its equipment, which implies the need for periodic adjustments, monitoring and control. As the prospect of minimizing direct human action grows, these adjustments should increasingly be made through automation. An automated system not only provides a longer life but also greater control over the production of the well. The controller is the brain of most automation systems: it holds the logic and strategies of the work process that make the system operate efficiently. Control is so important to any automation system that, as the ESP system becomes better understood and research progresses, many controllers are expected to be proposed for this artificial lift method. Once a controller is proposed, it must be tested and validated before it can be taken as efficient and functional. Using a producing well or a test well could make such testing easier, but with the serious risk that flaws in the controller design would damage well equipment, much of it expensive. Given this reality, the main objective of the present work is to present an environment for evaluating fuzzy controllers for wells equipped with ESP systems, using a computer simulator representing a virtual oil well, a fuzzy controller design software package, and a PLC. The proposed environment reduces the time required to test and adjust a controller and allows a rapid diagnosis of its efficiency and effectiveness. The control algorithms are implemented both in a high-level language, through the controller design software, and in a language specific to PLC programming, the Ladder Diagram language.
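To make the fuzzy-controller idea concrete, here is a toy Python example; the membership limits, rule base and output corrections are invented for illustration and are not field values from this work:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_frequency(intake_pressure):
    """Toy fuzzy rule base for an ESP well:
    LOW pressure -> slow the pump, OK -> hold, HIGH -> speed up.
    Output: pump frequency correction in Hz, by weighted average
    (center-of-sets defuzzification).
    """
    low = tri(intake_pressure, 0, 20, 40)
    ok = tri(intake_pressure, 30, 50, 70)
    high = tri(intake_pressure, 60, 80, 100)
    actions = {-2.0: low, 0.0: ok, +2.0: high}   # Hz correction per rule
    total = sum(actions.values())
    return sum(a * w for a, w in actions.items()) / total if total else 0.0

print(fuzzy_frequency(65))   # pressure on the high side -> positive correction
```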

Relevance:

10.00%

Publisher:

Abstract:

Originally aimed at operational objectives, the continuous measurement of well bottomhole pressure and temperature, recorded by permanent downhole gauges (PDG), finds vast applicability in reservoir management. It contributes to the monitoring of well performance and makes it possible to estimate reservoir parameters over the long term. However, notwithstanding its unquestionable value, PDG data are characterized by a large noise content. Moreover, the presence of outliers among valid signal measurements is a major problem as well. In this work, the initial treatment of PDG signals is addressed, based on curve smoothing, self-organizing maps and the discrete wavelet transform. Additionally, a system that couples fuzzy clustering with feed-forward neural networks is proposed for transient detection. The results obtained were considered quite satisfactory for offshore wells and matched real requirements for utilization.
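A minimal sketch of this kind of initial treatment (outlier removal against a running median, then moving-average smoothing); this is a Python illustration, not the thesis's wavelet-based processing:

```python
import numpy as np

def clean_pdg_signal(x, window=11, k=3.0):
    """De-spike and smooth a PDG pressure record.
    Outliers: samples further than k scaled median-absolute-deviations
    from a running median are replaced by that median.
    Smoothing: a simple moving average of the cleaned signal.
    """
    x = np.asarray(x, dtype=float)
    half = window // 2
    padded = np.pad(x, half, mode="edge")
    med = np.array([np.median(padded[i:i + window]) for i in range(len(x))])
    mad = np.median(np.abs(x - med)) or 1e-12          # robust spread estimate
    cleaned = np.where(np.abs(x - med) > k * 1.4826 * mad, med, x)
    kernel = np.ones(window) / window
    return np.convolve(np.pad(cleaned, half, mode="edge"), kernel, mode="valid")
```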

Relevance:

10.00%

Publisher:

Abstract:

With the growth of energy consumption worldwide, conventional reservoirs, the so-called "easy exploration and production" reservoirs, are no longer meeting global energy demand. This has led many researchers to develop projects that address these needs, and companies in the oil sector have invested in techniques that help locate and drill wells. One of the techniques employed in the oil exploration process is Reverse Time Migration (RTM), a seismic imaging method that produces excellent images of the subsurface. Its algorithm is based on solving the wave equation, and RTM is considered one of the most advanced seismic imaging techniques. The economic value of the oil reserves that require RTM to be located is very high, which makes the development of these algorithms a competitive differentiator for seismic processing companies. However, RTM requires great computational power, which still somewhat hampers its practical success. The objective of this work is to explore the implementation of this algorithm on unconventional architectures, specifically GPUs using CUDA, analyzing the difficulties in developing it as well as the performance of the algorithm in its sequential and parallel versions.
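At the core of RTM is repeated finite-difference propagation of the wave equation, forward in time for the source field and in reverse time for the receiver field. The following 1-D NumPy sketch shows the propagation kernel only (production RTM uses 2-D/3-D kernels, here targeted at GPUs via CUDA; all values are illustrative):

```python
import numpy as np

def propagate(velocity, source, dt, dx, nt):
    """Second-order finite differences for the 1-D acoustic wave equation
    u_tt = v^2 u_xx -- the kernel that RTM runs forward and then in
    reverse time.
    velocity: v per grid cell; source: (index, amplitude per time step).
    """
    nx = len(velocity)
    prev, curr = np.zeros(nx), np.zeros(nx)
    c2 = (velocity * dt / dx) ** 2        # CFL factor, must be <= 1 for stability
    src_ix, wavelet = source
    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = curr[2:] - 2 * curr[1:-1] + curr[:-2]   # discrete u_xx
        nxt = 2 * curr - prev + c2 * lap
        nxt[src_ix] += wavelet[it]        # inject the source term
        prev, curr = curr, nxt
    return curr

v = np.full(200, 1500.0)                  # homogeneous 1500 m/s medium
wavelet = np.diff(np.exp(-((np.arange(300) - 50) / 10.0) ** 2), prepend=0)
field = propagate(v, (100, wavelet), dt=1e-3, dx=2.0, nt=300)
```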

Relevance:

10.00%

Publisher:

Abstract:

In this paper, artificial neural networks (ANNs) based on supervised and unsupervised algorithms were investigated for use in the study of rheological parameters of solid pharmaceutical excipients, in order to develop computational tools for manufacturing solid dosage forms. Among the four supervised neural networks investigated, the best learning performance was achieved by a feedforward multilayer perceptron whose architecture comprised eight neurons in the input layer, sixteen neurons in the hidden layer and one neuron in the output layer. Learning and predictive performance for the angle of repose was poor, while the Carr index and Hausner ratio (CI and HR, respectively) showed very good fitting capacity and learning; HR and CI were therefore considered suitable descriptors for the next stage of development of supervised ANNs. Clustering capacity was evaluated for five unsupervised strategies. Networks based on purely competitive unsupervised strategies, the classic "Winner-Take-All", "Frequency-Sensitive Competitive Learning" and "Rival-Penalized Competitive Learning" (WTA, FSCL and RPCL, respectively), were able to cluster the database, but the classification was very poor, with severe errors such as grouping data with conflicting properties into the same cluster or even the same neuron; moreover, the criteria these networks adopted for clustering could not be established. Self-Organizing Maps (SOM) and Neural Gas (NG) networks showed better clustering capacity. Both recognized the two major groupings in the data, corresponding to lactose (LAC) and cellulose (CEL). However, SOM made some errors in classifying data from the minority excipients magnesium stearate (EMG), talc (TLC) and attapulgite (ATP). The NG network, in turn, performed a very consistent classification of the data and resolved the misclassifications of the SOM, making it the most appropriate network for classifying the data in this study. The use of NG networks in pharmaceutical technology had not previously been reported. NG therefore has great potential for use in the development of software for automated classification systems for pharmaceutical powders, and as a new tool for mining and clustering data in drug development.
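Of the competitive strategies named above, classic Winner-Take-All is the simplest; the Python sketch below (with invented toy data, not the paper's excipient dataset) shows its core update, in which only the prototype closest to each sample moves toward it:

```python
import numpy as np

def winner_take_all(data, n_units=2, lr=0.1, epochs=50, seed=0):
    """Classic Winner-Take-All competitive learning (a minimal sketch).
    data: array of shape (n_samples, n_features).
    Returns the learned prototype vectors, one per unit.
    """
    rng = np.random.default_rng(seed)
    w = data[rng.choice(len(data), n_units, replace=False)].astype(float)
    for _ in range(epochs):
        for x in rng.permutation(data):
            winner = np.argmin(np.linalg.norm(w - x, axis=1))
            w[winner] += lr * (x - w[winner])   # only the winner learns
    return w

# Two toy clusters in a 2-D feature space (illustrative data only).
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.1, (30, 2)), rng.normal(1, 0.1, (30, 2))])
print(winner_take_all(data))   # prototypes settle near (0,0) and (1,1)
```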

Relevance:

10.00%

Publisher:

Abstract:

This work presents a discussion of the role of Non-Governmental Organizations (NGOs) in the social policy scenario, focusing on basic education directed to the poor in Brazil between 1992 and 2002. The theme is relevant particularly because of the importance of NGO action in both the national and international spheres. The study assumes that NGOs are instruments of (social) control that filter social demands directed at the State, especially regarding basic education for the poor. It also discusses the process of recognition and expansion of NGOs as a result of the political and economic conjuncture that led to the State reform, which had an impact on the field of social policies. A close examination of these complex relations was only possible through an understanding of the establishment of the third sector, with NGOs as its main expression. Relevant facts about this reality are considered in order to delimit the extent of the phenomenon, with a brief reference to its origins and to the marks of its conjunctural relations, registering the multiple faces of these organizations and the constitutive elements of the debate among different visions of the third sector, of which NGOs form a part. With this approach, documents and publications by NGOs and the government are examined. Based on this material, the purposes announced by these organizations are analyzed in light of the Brazilian social, political and economic conjuncture. The State of Maranhão was chosen as an example of this context, owing to its high levels of poverty and low school performance, and because a growing number of NGO actions promoting social policies are carried out there. The study concludes that NGOs help strengthen the theses that point to a reduction of the State's responsibility for free, public, quality education, and that this principle is being negotiated through the partners' actions.

Relevance:

10.00%

Publisher:

Abstract:

The objective of this work was to compare the neutral detergent fiber (NDF) and acid detergent fiber (ADF) values obtained with the ANKOM equipment and by the conventional (Van Soest) method. In the first trial, five different materials were analyzed (sugarcane, brachiaria grass, corn silage, citrus pulp and bovine feces) and four types of sample filter bags were tested. The experimental design was randomized blocks in a 5x4 factorial (five materials and four filter bag types), with three replicates. The means obtained were compared with those from the conventional method. The second trial evaluated the effect of the sample amount per bag (0.5, 0.8 and 1.0 g) on NDF and ADF contents in three feed types, using the ANKOM. In this trial, the experimental design was randomized blocks in a 3x3 factorial (three feeds x three amounts), with three replicates. The filter bag types used did not influence the NDF contents of the different feeds, except for the feces, for which the nylon bags yielded lower NDF concentrations. There was no difference between the NDF and ADF values obtained with the ANKOM equipment and with the conventional method for the feeds studied, except for citrus pulp, whose mean ADF value by ANKOM was lower than that obtained by the conventional method. The sample amount had no effect on the NDF concentration of the feeds analyzed with the ANKOM equipment.