978 results for Extraction methods


Relevance: 20.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 20.00%

Abstract:

In agile methods, requirements engineering is often seen as a bureaucratic activity that makes the process less agile. At the same time, the lack of documentation in agile development environments is identified as one of the methodology's main challenges, so there is a contradiction between what agile methodology claims and what occurs in real environments. For example, user stories are widely used in agile methods to describe requirements, but they are not sufficient on their own: a user story is too narrow an artifact to represent and detail requirements, and activities such as verifying the software context and the dependencies between stories are limited when only this artifact is used. In requirements engineering, goal-oriented approaches bring benefits to requirements documentation, including completeness of requirements, analysis of alternatives, and support for requirements rationale. Among these approaches, the i* modeling technique stands out by providing a graphical view of the actors involved in the system and their dependencies. This work proposes an additional resource that aims to reduce this lack of documentation in agile methods. Its objective is to provide a graphical view of software requirements and their relationships through i* models, thus enriching requirements in agile methods. To this end, we propose a set of heuristics for mapping requirements expressed as user stories into i* models. These models can be used as a form of documentation in agile environments, because the mapping gives requirements a broader view, with their proper relationships to the business environment they will serve.
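The abstract does not detail the proposed heuristics, but the general idea of mapping the canonical user-story template into i*-style elements can be sketched as follows (the field names and the parsed structure are illustrative assumptions, not the work's actual notation):

```python
import re

# Hypothetical sketch of one mapping heuristic: parse the canonical user-story
# template "As a <role>, I want <goal> so that <reason>" and emit an i*-style
# actor with a goal and an optional softgoal. The actual heuristics proposed
# in the work are not described in the abstract.
STORY = re.compile(
    r"As an? (?P<actor>.+?), I want (?P<goal>.+?)(?: so that (?P<reason>.+))?$"
)

def story_to_istar(text):
    m = STORY.match(text.strip())
    if not m:
        return None
    return {"actor": m.group("actor"),
            "goal": m.group("goal"),
            "softgoal": m.group("reason")}

print(story_to_istar("As a customer, I want to track my order so that I can plan delivery"))
```

A full mapping would also have to relate the actors extracted from different stories through dependency links, which is where a graphical i* model adds information that the flat list of stories lacks.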

Relevance: 20.00%

Abstract:

Data clustering is applied in various fields, such as data mining, image processing, and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The Fuzzy C-Means algorithm (FCM) is the fuzzy clustering algorithm most used and discussed in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers, so choosing a good set of initial centers is very important for the algorithm's performance. In FCM, however, the initial centers are chosen randomly, which makes it difficult to find a good set. This work proposes three new methods for obtaining initial cluster centers deterministically for the FCM algorithm; they can also be used in FCM variants, and here they were applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers, reducing the number of iterations these algorithms need to converge and their processing time, without harming cluster quality or even improving it in some cases. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the FCM and ckMeans algorithms modified with the proposed initialization methods, when applied to various data sets.
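The abstract does not describe its three initialization methods, but the flavor of a deterministic initializer can be illustrated with a classical farthest-point heuristic (an assumption for illustration only): start from the point nearest the data mean, then repeatedly add the point farthest from the centers chosen so far.

```python
def farthest_point_centers(points, k):
    # Deterministic initializer (illustrative; not one of the paper's three
    # methods): seed with the point closest to the data mean, then greedily
    # add the point whose nearest chosen center is farthest away.
    dim = len(points[0])
    mean = [sum(p[d] for p in points) / len(points) for d in range(dim)]

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    centers = [min(points, key=lambda p: dist2(p, mean))]
    while len(centers) < k:
        centers.append(max(points, key=lambda p: min(dist2(p, c) for c in centers)))
    return centers

data = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9), (10.0, 0.0)]
print(farthest_point_centers(data, 3))
```

Because the procedure has no random step, every run produces the same centers, which is exactly the property the proposed methods exploit to make FCM's iteration count reproducible.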

Relevance: 20.00%

Abstract:

Symbolic Data Analysis (SDA) mainly aims to provide tools for reducing large databases in order to extract knowledge, and techniques to describe the units of such data as complex units, such as intervals or histograms. The objective of this work is to extend classical clustering methods to symbolic interval data using interval-based distances. The main advantage of using an interval-based distance for interval-valued data is that it preserves the underlying imprecision of the intervals, which is usually lost when real-valued distances are applied. This work also includes an approach that allows existing indices to be adapted to the interval context. The proposed methods with interval-based distances are compared with punctual distances from the literature through experiments with simulated and real interval data.
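One widely used interval-based distance (the abstract does not name the specific one adopted) is the Hausdorff distance for intervals, which keeps both endpoints in play instead of collapsing each interval to its midpoint:

```python
def hausdorff_interval(a, b):
    # Hausdorff distance between intervals a = [a1, a2] and b = [b1, b2]:
    # the larger of the two endpoint differences. It is zero only when the
    # intervals coincide, so interval width (imprecision) is not discarded.
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

print(hausdorff_interval((1.0, 3.0), (2.0, 5.0)))  # 2.0
```

Note that a midpoint-based (punctual) distance would treat [1, 3] and [0, 4] as identical (both centered at 2), while the interval distance above separates them, which is precisely the imprecision-preserving behavior the work argues for.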

Relevance: 20.00%

Abstract:

Clinical evaluation of the lower limbs in venous insufficiency does not by itself identify the systems involved or the anatomical levels, so complementary exams are needed. These exams may be invasive or non-invasive. The invasive ones, such as phlebography and ambulatory venous pressure, despite their good accuracy, cause discomfort and complications. Among the non-invasive ones, the following stand out: continuous-wave Doppler ultrasound, photoplethysmography, air plethysmography, and duplex scanning. Doppler ultrasound evaluates blood flow velocity indirectly. Photoplethysmography evaluates venous refill time, providing an objective parameter for quantifying venous reflux. Air plethysmography quantifies any reduction in capacitance, as well as reflux and the performance of the calf muscle pump. Duplex scanning is considered the gold standard among the non-invasive methods because it allows quantitative and qualitative evaluation, providing anatomical and functional information and giving the most complete and detailed assessment of the deep and superficial venous systems.

Relevance: 20.00%

Abstract:

OBJECTIVE: To evaluate the differences between three methods for measuring experimental infarction in rats, relative to the traditional method. METHODS: The infarcted area by histology (AREA), the internal perimeter of the infarcted cavity by histology (PER), and the internal perimeter by echocardiography (ECO) were compared with the traditional method (histological analysis of the epicardial and endocardial circumferences of the infarcted region, CIR). Repeated-measures ANOVA complemented by Dunn's multiple-comparison test, the Bland & Altman agreement method, and Spearman's correlation test were used; significance was set at p < 0.05. RESULTS: Data from 122 animals were analyzed 3 to 6 months after infarction. Infarct size differed between CIR and the other three methods (p < 0.001): CIR = 42.4% (35.9-48.8), PER = 50.3% (39.1-57.0), AREA = 27.3% (20.2-34.3), ECO = 46.1% (39.9-52.6). Thus, the area measurement underestimated infarct size by 15%, while the echocardiographic and histological internal-perimeter measurements overestimated it by 4% and 5%, respectively. For ECO and PER, although the difference between the methods was only 1.27%, the agreement interval ranged from 24.1% to -26.7%, suggesting low agreement between the methods. Regarding associations, there were statistically significant correlations between CIR and PER (r = 0.88, p < 0.0001), CIR and AREA (r = 0.87, p < 0.0001), and CIR and ECO (r = 0.42, p < 0.0001). CONCLUSION: In determining infarct size, despite the high correlation, there was low agreement between the methods.
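The distinction the study draws between correlation and agreement rests on the Bland & Altman method, which can be sketched numerically (the paired values below are toy data, not the study's measurements):

```python
def bland_altman_limits(x, y):
    # Bland & Altman agreement analysis: the bias is the mean of the paired
    # differences, and the 95% limits of agreement are bias +/- 1.96 times
    # the sample standard deviation of those differences.
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired measurements from two methods:
bias, lo, hi = bland_altman_limits([10.0, 12.0, 9.0, 11.0], [11.0, 11.0, 10.0, 10.0])
print(bias, lo, hi)
```

As in the study's ECO-versus-PER comparison, a bias near zero says nothing by itself: wide limits of agreement (here about ±2.26 around a zero bias) still indicate that individual measurements from the two methods can disagree substantially.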

Relevance: 20.00%

Abstract:

The monitoring of earth dams makes use of visual inspection and instrumentation to identify and characterize deterioration that compromises the safety of earth dams and associated structures. Visual inspection is subjective and can lead to misinterpretation or to the omission of important information, and some problems are detected too late. Instrumentation is efficient, but certain technical or operational issues can impose restrictions. Thus, visual inspection and instrumentation together can still leave gaps in information. Geophysics offers consolidated, non-invasive, non-destructive, low-cost methods with strong potential that can be used to assist instrumentation. Where visual inspection and instrumentation do not provide all the necessary information, geophysical methods can provide more complete and relevant information. To test this, geophysical acquisitions were performed using ground-penetrating radar (GPR), electrical resistivity, seismic refraction, and refraction microtremor (ReMi) on the dike of the dam at Sant Llorenç de Montgai, located in the province of Lleida, 145 km from Barcelona, Catalonia. The results confirmed that each of the geophysical methods used responded satisfactorily to the conditions of the earth dike, the anomalies present, and the geological features found, such as alluvium and carbonate and evaporite rocks. It was also confirmed that these methods, when used in an integrated manner, reduce the ambiguities of individual interpretations. They improve imaging of the interior of the dike and of major geological features, allowing inspection of the massif and its foundation. Consequently, the results obtained in this study demonstrate that these geophysical methods are effective for inspecting earth dams and are an important complement to instrumentation and visual inspection in assessing dam safety.

Relevance: 20.00%

Abstract:

Separation methods see reduced application as a result of operational costs, low throughput, and the long time needed to separate the fluids. Nevertheless, these treatment methods are important because of the need to extract unwanted contaminants in oil production: the oil concentration in produced water must be minimal (around 20 to 40 ppm) before it can be discharged to the sea. Given the need for primary treatment, the objective of this project is to study and implement algorithms for the identification of polynomial NARX (Nonlinear Auto-Regressive with eXogenous input) models in closed loop, to implement structure identification, and to compare strategies using PI control and predictive control with on-line updated NARX models, on a three-phase separator in series with three hydrocyclone batteries. The main goals are: to obtain an optimized phase-separation process that regulates the system even in the presence of oil gushes; to show that it is possible to obtain optimized controller tunings by analyzing the loop as a whole; and to evaluate and compare the PI and predictive control strategies applied to the process. To accomplish these goals, a simulator was used to represent the three-phase separator and the hydrocyclones. System-identification (NARX) algorithms were developed using RLS (Recursive Least Squares), along with methods for model-structure detection. Predictive control algorithms with on-line updated NARX models were also implemented, as were optimization algorithms using PSO (Particle Swarm Optimization). The project ends with a comparison of the results obtained with the PI and predictive controllers (both tuned through the particle-swarm algorithm) in the simulated system. The performed optimizations make the system less sensitive to external perturbations; when optimized, the two controllers show similar results, with the predictive controller somewhat less sensitive to disturbances.
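The PI strategy compared in this work can be illustrated with a minimal discrete PI loop on a hypothetical first-order plant (the simulator's actual separator dynamics and the PSO-derived tunings are not given in the abstract; the gains below are arbitrary):

```python
class PI:
    # Discrete PI controller: u = Kp * e + Ki * integral(e) dt.
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.acc = 0.0  # accumulated (integrated) error

    def step(self, setpoint, measurement):
        e = setpoint - measurement
        self.acc += e * self.dt
        return self.kp * e + self.ki * self.acc

# Hypothetical first-order level process dy/dt = u - y, standing in for one
# separator level loop; the controller drives the level to the setpoint 1.0.
pi, y = PI(kp=2.0, ki=1.0, dt=0.1), 0.0
for _ in range(200):
    u = pi.step(1.0, y)
    y += (u - y) * 0.1
print(round(y, 3))
```

In the project itself, each such loop's gains would be tuned not in isolation but by the PSO algorithm evaluating the separator and hydrocyclones as a whole, which is the point of the loop-wide optimization goal.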

Relevance: 20.00%

Abstract:

The main objective of this work is to find mathematical models, based on linear parametric estimation techniques, for the problem of calculating the gas flow rate in oil wells. In particular, we focus on obtaining flow models for wells that produce by the plunger-lift technique, in which case there are high peaks in the flow values that hinder their direct measurement by instruments. For this, we developed estimators based on recursive least squares and analyzed statistical measures such as the autocorrelation, cross-correlation, variogram, and cumulative periodogram, which are computed recursively as data are obtained in real time from the plant in operation; the values obtained for these measures tell us how accurate the current model is and how it can be changed to better fit the measured values. The models were tested in a pilot plant that emulates the gas production process in oil wells.
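The recursive-least-squares estimator at the core of this approach can be sketched for a generic two-parameter linear model (the actual regressors used for the flow models are not specified in the abstract; the straight-line model below is purely illustrative):

```python
def rls_step(theta, P, phi, y, lam=1.0):
    # One recursive-least-squares update: K = P phi / (lam + phi' P phi),
    # theta <- theta + K (y - phi' theta), P <- (P - K phi' P) / lam.
    # lam is the forgetting factor (1.0 = ordinary RLS, no forgetting).
    Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
            P[1][0] * phi[0] + P[1][1] * phi[1]]
    denom = lam + phi[0] * Pphi[0] + phi[1] * Pphi[1]
    K = [Pphi[0] / denom, Pphi[1] / denom]
    err = y - (phi[0] * theta[0] + phi[1] * theta[1])  # prediction error
    theta = [theta[0] + K[0] * err, theta[1] + K[1] * err]
    P = [[(P[0][0] - K[0] * Pphi[0]) / lam, (P[0][1] - K[0] * Pphi[1]) / lam],
         [(P[1][0] - K[1] * Pphi[0]) / lam, (P[1][1] - K[1] * Pphi[1]) / lam]]
    return theta, P

# Recover y = 2u + 1 from noiseless samples, starting from a large covariance.
theta, P = [0.0, 0.0], [[1000.0, 0.0], [0.0, 1000.0]]
for u in [0.0, 1.0, 2.0, 3.0, 4.0]:
    theta, P = rls_step(theta, P, [u, 1.0], 2.0 * u + 1.0)
print([round(t, 3) for t in theta])
```

Because each update consumes one sample and discards it, the same loop can run on-line against streaming plant data, which is what allows the statistical diagnostics mentioned above to be tracked in real time alongside the estimate.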

Relevance: 20.00%

Abstract:

This work studies the asymptotic behavior of Pearson's (1900) statistic, the theoretical apparatus of the well-known chi-squared test, also commonly denoted the χ² test. We first study the behavior of the distribution of Pearson's (1900) chi-squared statistic for a sample {X1, X2, ..., Xn} when n → ∞ and pi = pi0 for all n. We then detail the arguments used in Billingsley (1960), which prove the convergence in distribution of a statistic, similar to Pearson's, based on a sample from a stationary, ergodic Markov chain with finite state space S.
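Pearson's statistic itself is elementary to compute: it sums (O - E)²/E over the cells, where O is the observed count and E the count expected under the null hypothesis. A minimal sketch (the fair-die data are an invented example):

```python
def pearson_chi2(observed, expected):
    # Pearson's chi-squared statistic: sum over cells of (O - E)^2 / E.
    # Under the null hypothesis, it converges in distribution to a
    # chi-squared law as the sample size grows.
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Example: 60 rolls of a die, expected count 10 per face under fairness.
print(round(pearson_chi2([8, 12, 9, 11, 10, 10], [10.0] * 6), 2))  # 1.0
```

The asymptotic results studied in the work concern exactly this quantity: for i.i.d. samples its limit is the classical chi-squared distribution, and the Billingsley-style argument extends a similar statistic to samples drawn from a stationary ergodic Markov chain.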

Relevance: 20.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 20.00%

Abstract:

This work discusses the application of ensemble techniques to the development of multimodal recognition systems with revocable biometrics. Biometric systems are the future of identification and user access control, as the steady growth of such systems in today's society attests. However, much remains to be improved, mainly regarding the accuracy, security, and processing time of such systems. In the search for more efficient techniques, multimodal systems and the use of revocable biometrics are promising, and can address many of the problems of traditional biometric recognition. A multimodal system combines different biometric security techniques and overcomes many limitations, such as failures in extracting or processing the data. Among the various ways to build a multimodal system, the use of ensembles is quite promising, motivated by the performance and flexibility they have demonstrated over the years in their many applications. Regarding security, one of the biggest problems is that biometric traits are permanently tied to the user and cannot be changed if compromised. This problem is addressed by techniques known as revocable biometrics, which apply a transformation to the biometric data in order to protect the original characteristics, making cancellation and replacement possible. To contribute to this important subject, this work compares the performance of individual classifiers, as well as ensembles of classifiers, on the original data and on biometric spaces transformed by different functions. Another highlight is the use of Genetic Algorithms (GA) in different parts of the systems, seeking to further maximize their efficiency. One motivation of this development is to evaluate the gain that ensemble systems maximized by different GAs can bring to the data in the transformed space. Another is to generate even more efficient revocable systems by combining two or more transformation functions, demonstrating that it is possible to extract information of a similar standard by applying different transformation functions. All of this makes clear the importance of revocable biometrics, ensembles, and GAs in the development of more efficient biometric systems, something increasingly important in the present day.
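The revocability idea can be illustrated with a key-seeded random projection, one common family of cancelable transforms (an assumption for illustration; the transformation functions actually compared in the work are not named in the abstract):

```python
import random

def cancelable_template(features, key, out_dim):
    # Illustrative revocable transform: project the feature vector through a
    # random matrix seeded by a user-specific key. If the stored template is
    # compromised, issuing a new key "cancels" it, while the underlying
    # biometric trait is never stored directly.
    rng = random.Random(key)
    return [sum(rng.uniform(-1, 1) * f for f in features) for _ in range(out_dim)]

feat = [0.2, 0.7, 0.1, 0.9]
t1 = cancelable_template(feat, key=42, out_dim=3)
t2 = cancelable_template(feat, key=99, out_dim=3)
print(t1 != t2)  # different keys yield different templates for the same trait
```

Classifiers (individual or ensembles) are then trained and matched entirely in the transformed space, which is why the work measures how much accuracy survives the transformation and how much GA-optimized ensembles can recover.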

Relevance: 20.00%

Abstract:

This project was developed as a partnership between the Laboratory of Stratigraphical Analyses of the Geology Department of UFRN and the company Millennium Inorganic Chemicals Mineração Ltda. The company is located at the northern end of the Paraíba coast, in the municipality of Mataraca. Millennium's main prospected products are heavy minerals, such as ilmenite, rutile, and zircon, present in the dune sands. These dunes are predominantly inactive and overlie the upper portion of the Barreiras Formation rocks. Mining is carried out with a dredge floating on an artificial lake over the dunes. The dredge removes dune sand from the lake bottom (after cutting it from the lake borders with water jets) and pipes it to the concentration plant, where the minerals are separated. The present work consisted of acquiring the external geometries of the dunes in order to build a 3D static model of these sedimentary deposits, with emphasis on the behavior of the structural top of the Barreiras Formation rocks (the lower limit of the deposit). Knowledge of this surface is important for the company's mine planning, because a calculation mistake could make the dredge work too close to this limit, with the risk that rock fragments obstruct the dredge, causing financial losses both in equipment repair and in days of stopped production. During the field stages (carried out in 2006 and 2007), topographic surveys were performed with a total station and geodetic GPS, along with shallow geophysical acquisitions with GPR (Ground Penetrating Radar). Almost 10.4 km of topography and 10 km of GPR profiles were acquired. The geodetic GPS was used for geopositioning the data and for the topographic survey of a 630 m traverse in the 2007 stage. GPR proved to be a reliable method: ecologically clean, fast to acquire, and low-cost compared with traditional methods such as drilling surveys. The main advantage of this equipment is that it provides continuous information on the upper surface of the Barreiras Formation rocks. The 3D static models were built from the acquired data using two 3D visualization softwares: GoCAD 2.0.8 and Datamine. The 3D visualization allows a better understanding of the behavior of the Barreiras surface and makes several types of measurement possible, supporting calculations and allowing the procedures used for mineral extraction to be applied with greater safety.

Relevance: 20.00%

Abstract:

Improving the cassava starch extraction process is a much-discussed problem among industry and researchers. Currently, each tonne of processed cassava root yields around 250 kg of starch, while about 140 kg of starch that was not extracted during processing is lost in the fibrous residue. This work analyzed the effect of a second, parallel extraction and of auxiliary solutions on extracting the starch retained in the fibrous residue (bagasse), aiming to improve industrial yield. Four treatments were evaluated: water (T1), 0.2% NaOH solution (T2), 10% v/v water-alcohol solution (T3), and 0.002% water-Tween 80 solution (T4). From the results it was possible to conclude that the auxiliary solutions did not promote a significant difference (p < 0.05) in extraction relative to water (T1), and that a new extraction of the starch retained in the bagasse would allow a reduction of about 20% in the starch content of the bagasse, which is not extracted in the conventional process.

Relevance: 20.00%

Abstract:

The genus Pachyrhizus has been studied as a source of starchy raw material because of the considerable starch content in the roots of its species. This work aimed to characterize P. ahipa roots, process them in the laboratory to extract the starch, and analyze the starch for proximate composition, amylose content, granule shape and size under scanning electron microscopy, and paste viscosity (RVA). The P. ahipa roots presented 18% dry matter, of which 7.68% was starch. The starch yield obtained was low (4.28%), pointing to the need for studies to improve the extraction process. The product obtained presented 12.3% moisture and 84% starch with 13% amylose, and low levels of other components (wet basis). Scanning electron microscopy showed starch granules of circular and polygonal shapes, with sizes ranging from 10 to 25 µm. The viscosity profile of this starch showed a low pasting temperature (56 °C) and a peak viscosity of 272 RVU, the latter value close to that observed for cassava starch under the same conditions. The P. ahipa starch also showed low hot-paste stability and a tendency to retrograde on cooling.