14 results for bag-of-features
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The maintenance and evolution of software systems has become a critical task in recent years due to the diversity and high demand of features, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources – commits and issues – of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation – choosing the scenarios and preparing the target releases; (ii) dynamic analysis – determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis – processing and comparing the dynamic analysis results across different releases; and (iv) repository mining – identifying issues and commits associated with the detected performance variation. Empirical studies were conducted to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA – a web system for academic management; (ii) ArgoUML – a UML modeling tool; and (iii) Netty – a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases were analyzed (seven from each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate which commit properties are more likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The Receiver Operating Characteristic (ROC) area of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
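As a rough illustration of the kind of commit-level model and ROC evaluation the abstract above describes (a synthetic sketch, not the thesis' data or pipeline; the feature names mirror the variables mentioned in the abstract and the labels are generated at random):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical sketch: scoring commit properties against performance degradation.
# "days_before_release" and "day_of_week" echo the abstract's variables; the data is synthetic.
rng = np.random.default_rng(0)
n = 997                                     # same order of magnitude as the mined commits
days_before_release = rng.integers(0, 120, n)
day_of_week = rng.integers(0, 7, n)
# Synthetic label: commits closer to the release date degrade slightly more often.
p = 1 / (1 + np.exp(0.03 * days_before_release - 1.5))
y = rng.random(n) < p

X = np.column_stack([days_before_release, day_of_week])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"ROC AUC: {auc:.2f}")                # 0.5 would be random guessing; 0.6 is 10 points better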
Abstract:
The research that led to this dissertation took as its object a set of scenic/ideological aspects inherent to the productions of the Culture Industry. The intellectual output of Theodor W. Adorno and Max Horkheimer underpinned the approach to this subject, since it provides the very set of scenic/ideological features to be explored: according to the authors, the scenes produced by the culture industry are linked to the dominant ideology, as they act in favor of maintaining the status quo. The first objective was to define this set of features inherent to the scene produced by the culture industry through an exploration of the literature produced by Adorno and Horkheimer, which made it possible to define a set composed of nine elements: construction of characters as characteristic types; stereotypes; naturalization of stereotyped language; simplistic playwriting; reuse of dramatic formulas; love and sexuality as plot themes; use of the tragic element; objective representation; and approximation of fiction and reality. The second goal was to analyze the scene produced by the culture industry today, in order to verify whether the scenic/ideological aspects indicated by Adorno and Horkheimer in the mid-twentieth century are still present in productions from the beginning of the twenty-first century. Through the analysis of three soap operas produced in Brazil in 2012, it was found that all nine scenic/ideological aspects indicated by Adorno and Horkheimer appeared in the observed productions. Additionally, a new scenic/ideological feature not indicated by Adorno and Horkheimer is present: merchandising.
Abstract:
On the northern coast of Rio Grande do Norte, sediment transport processes are intensely controlled by wind and sea (waves and currents) action, causing erosion and morphological instability of the shoreline. Given the importance of this coastal zone, multi-spectral mapping and physical-chemical characterization of mudflats and mangroves were carried out to support mitigating actions related to containing the erosive process in the Macau and Serra oil fields installed in the study area. The multi-spectral bands of LANDSAT 5 TM images from 2000 and 2008 were subjected to several digital processing steps and to RGB color compositions integrating spectral bands and Principal Components. This processing methodology, together with field work, was important for mapping the different surface units. It was possible to draw an analogy between the spectral characteristics of the mudflats and the vegetated (mangrove) areas, showing the possibility of restoring this area and contributing to the environmental monitoring of that ecosystem. The maps of the several units were integrated in a GIS environment at 1:60,000 scale, including the classification of features according to the presence or absence of vegetation cover. The methodological strategy established that there are at least 10.13 km2 of sandy-muddy flats, of which approximately 0.89 km2 could be used for reforestation with typical mangrove flora. The physical-chemical characterization showed areas with potential for introducing local mangrove species, with a pH above neutral (mean of 8.4). The characteristic grain size is sand in the fine fractions; the high levels of carbonate, organic matter and major and trace elements are generally concentrated where the sediment has the smallest grain sizes, showing the strong correlation of those elements with the finer sediment particles. The application of this methodological strategy is relevant to a better understanding of the behavior of the features, and the physical-chemical data of the sediment samples collected in the field allow analyzing the suitability of the sandy-muddy flats for reforestation with local mangrove species to mitigate the erosive action and coastal processes in the areas occupied by the oil industry.
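A minimal sketch of the Principal Components step applied to a stack of co-registered bands (the band array below is random noise standing in for the LANDSAT 5 TM scenes; shapes and band count are assumptions):

import numpy as np

# Hypothetical sketch: principal components of a multi-band image stack.
rng = np.random.default_rng(1)
n_bands, rows, cols = 6, 200, 200
bands = rng.random((n_bands, rows, cols))   # stand-in for co-registered TM bands

X = bands.reshape(n_bands, -1).T            # one row per pixel, one column per band
X = X - X.mean(axis=0)                      # center each band
cov = np.cov(X, rowvar=False)               # band-to-band covariance
eigvals, eigvecs = np.linalg.eigh(cov)      # eigh because the covariance is symmetric
order = np.argsort(eigvals)[::-1]           # components sorted by explained variance
pcs = (X @ eigvecs[:, order]).T.reshape(n_bands, rows, cols)
print(pcs.shape)                            # principal-component "bands", e.g. for an RGB composite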
Fault detection and isolation system for dynamic systems based on parametric identification
Abstract:
This research aims to contribute to the area of fault detection and diagnosis through the proposal of a new architecture for a fault detection and isolation (FDI) system. The proposed architecture presents innovations in the way the monitored physical quantities are linked to the FDI system and, as a consequence, in the way failures are detected, isolated and classified. A search for mathematical tools able to satisfy the objectives of the proposed architecture pointed to the use of the Kalman Filter and its derivatives, the EKF (Extended Kalman Filter) and the UKF (Unscented Kalman Filter). The first is efficient when the monitored process presents a linear relation between the monitored physical quantities and its output; the other two are suitable when these dynamics are nonlinear. A short comparison of their features and capabilities in the context of fault detection concludes that the UKF is a better alternative than the EKF to compose the proposed FDI architecture for processes with nonlinear dynamics. The results presented at the end of the research refer to linear and nonlinear industrial processes. The effectiveness of the proposed architecture can be observed in its application to both simulated and real processes. The contributions of this thesis are summarized at the end of the text.
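As a generic illustration of the residual-based detection idea behind Kalman-filter FDI schemes (a scalar sketch under assumed plant parameters, not the specific architecture proposed in the thesis):

import numpy as np

# Generic sketch: flag a fault when the Kalman filter innovation (residual) grows
# beyond a threshold. The scalar plant model, noise levels and fault are illustrative.
A, C = 0.95, 1.0            # state transition and output coefficients
Q, R = 1e-3, 1e-2           # process and measurement noise variances
x_est, P = 0.0, 1.0         # initial state estimate and covariance
threshold = 16.0            # roughly a 4-sigma test on the normalized innovation

rng = np.random.default_rng(2)
x_true = 0.0
for k in range(200):
    x_true = A * x_true + rng.normal(0, np.sqrt(Q))
    y = C * x_true + rng.normal(0, np.sqrt(R))
    if k >= 100:
        y += 0.5            # inject an additive sensor fault halfway through

    x_pred = A * x_est                       # predict
    P_pred = A * P * A + Q
    innov = y - C * x_pred                   # innovation and its variance
    S = C * P_pred * C + R
    if innov ** 2 / S > threshold:
        print(f"fault suspected at step {k}")
        break
    K = P_pred * C / S                       # update
    x_est = x_pred + K * innov
    P = (1 - K * C) * P_pred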
Abstract:
With the rapid growth of databases of various types (text, multimedia, etc.), there is a need for methods to order, access and retrieve data in a simple and fast way. Image databases, in addition to these needs, require a representation of the images in which their semantic content is taken into account. Accordingly, several proposals have been made, such as retrieval based on textual annotations. In the annotation approach, retrieval is based on comparing the textual description a user gives of the desired images with the descriptions of the images stored in the database. Among its drawbacks, the textual description is highly dependent on the observer, in addition to the effort required to describe every image in the database. Another approach is content-based image retrieval (CBIR), where each image is represented by low-level features such as color, shape and texture. Results in the CBIR area have been very promising; however, representing image semantics through low-level features remains an open problem. New feature extraction algorithms, as well as new indexing methods, have been proposed in the literature, but these algorithms are becoming increasingly complex. It is therefore natural to ask whether there is a relationship between the semantics of an image and the low-level features extracted from it; if so, which descriptors best represent the semantics; and, consequently, how descriptors should be used to represent the content of images. This thesis proposes a method to analyze the relationship between low-level descriptors and semantics in an attempt to answer these questions. It was also observed that there are three possibilities for indexing images: using composite feature vectors, using parallel and independent index structures (one per descriptor or set of descriptors), and using feature vectors arranged in a sequential order. The first two forms have been widely studied and applied in the literature, but there was no record of the third having been explored. This thesis therefore also proposes indexing with a sequential structure of descriptors, in which the order of the descriptors is based on the relationship between each descriptor and the users' semantics. Finally, the index proposed in this thesis proved better than the traditional approaches, and it was shown experimentally that the order of the sequence matters and that there is a direct relationship between this order and the relationship of the low-level descriptors with the users' semantics.
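A rough sketch of the sequential (cascade) indexing idea, in which descriptors are applied in a fixed order and each stage keeps only the closest candidates from the previous one (the descriptor names, dimensions and stage sizes below are illustrative placeholders, not the thesis' actual index structure):

import numpy as np

# Hypothetical sketch: sequential retrieval over three descriptors.
rng = np.random.default_rng(3)
n_images = 1000
db = {
    "color":   rng.random((n_images, 32)),   # e.g. a color histogram
    "texture": rng.random((n_images, 16)),   # e.g. texture statistics
    "shape":   rng.random((n_images, 8)),    # e.g. shape moments
}
# Per the abstract, the order would reflect how strongly each descriptor relates to user semantics.
descriptor_order = ["color", "texture", "shape"]

def sequential_search(query, keep=(200, 50, 10)):
    candidates = np.arange(n_images)
    for stage, name in enumerate(descriptor_order):
        dists = np.linalg.norm(db[name][candidates] - query[name], axis=1)
        candidates = candidates[np.argsort(dists)[:keep[stage]]]
    return candidates

query = {name: rng.random(feats.shape[1]) for name, feats in db.items()}
print(sequential_search(query))              # indices of the final retrieved images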
Abstract:
The development and refinement of techniques for simultaneous localization and mapping (SLAM) by an autonomous mobile robot, and the building of local 3-D maps from a sequence of images, are widely studied. This work presents a monocular visual SLAM technique based on the extended Kalman filter which finds features in a sequence of images using the SURF (Speeded Up Robust Features) descriptor and determines which features can be used as landmarks through a delayed initialization technique based on 3-D straight lines. Only the image coordinates of the detected features and the intrinsic and extrinsic camera parameters are available, and the position of a landmark can only be determined once depth information becomes available. Tests showed that, along the route, the mobile robot detects features in the images and, through the proposed delayed initialization technique, adds new landmarks to the state vector of the extended Kalman filter (EKF) after estimating the depth of the features. With the estimated landmark positions, it was possible to update the estimate of the robot's position at each step, obtaining good results that demonstrate the effectiveness of the proposed monocular visual SLAM system.
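The abstract's delayed initialization is based on 3-D straight lines; as a simpler, generic illustration of the depth-recovery step it depends on, a landmark can be triangulated once it has been observed from two sufficiently separated camera poses (the intrinsics, poses and pixel measurements below are made up):

import numpy as np

# Generic sketch: linear (DLT) triangulation of a landmark from two views.
K = np.array([[500., 0., 320.],
              [0., 500., 240.],
              [0., 0., 1.]])                          # assumed intrinsic matrix

def projection(R, t):
    return K @ np.hstack([R, t.reshape(3, 1)])        # 3x4 projection matrix

P1 = projection(np.eye(3), np.zeros(3))               # first pose at the origin
P2 = projection(np.eye(3), np.array([-0.5, 0., 0.]))  # second pose: 0.5 m baseline

def triangulate(P1, P2, uv1, uv2):
    A = np.vstack([uv1[0] * P1[2] - P1[0],
                   uv1[1] * P1[2] - P1[1],
                   uv2[0] * P2[2] - P2[0],
                   uv2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                               # back to Euclidean coordinates

point = np.array([1.0, 0.5, 4.0])                     # ground-truth landmark for the test
uv1 = P1 @ np.append(point, 1.0); uv1 = uv1[:2] / uv1[2]
uv2 = P2 @ np.append(point, 1.0); uv2 = uv2[:2] / uv2[2]
print(triangulate(P1, P2, uv1, uv2))                  # ~[1.0, 0.5, 4.0]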
Abstract:
We propose a robotics simulation platform, named S-Educ, developed specifically for educational robotics, which can be used as an alternative to or in association with robotics kits in classes involving robotics. In the usual approach, educational robotics relies on robotics kits for classes that generally involve interdisciplinary themes. The idea of this work is not to replace these kits, but to use the developed simulator as an alternative where, for some reason, the traditional kits cannot be used, or to use the platform together with them. To develop the simulator, we first surveyed the literature on the use of robotic simulators and robotics kits in education, from which it was possible to define a set of features considered important for such a tool. Then, in the software development phase, the S-Educ simulator was implemented taking into account the requirements and features defined in the design phase. Finally, to validate the platform, several tests were conducted with teachers, students and lay adults using the S-Educ simulator, to evaluate its use in educational robotics classes. The results show that a robotics simulator reduces financial costs, facilitates testing and reduces the robot damage inherent to its use, among other advantages. Furthermore, as a contribution to the community, the proposed tool can be used to increase the adoption of educational robotics methodologies and robotics competitions by Brazilian schools.
Abstract:
Traditional applications of feature selection in areas such as data mining, machine learning and pattern recognition aim to improve model accuracy and reduce computational cost. This is done by removing redundant, irrelevant or noisy data, finding a representative subset of features that reduces dimensionality without loss of performance. With the growth of research on ensembles of classifiers, and the observation that this type of model outperforms individual models when the base classifiers are diverse, a new field of application has opened for feature selection research: finding diverse feature subsets for building the base classifiers of ensemble systems. This work proposes an approach that maximizes the diversity of the ensembles by selecting feature subsets using a model that is independent of the learning algorithm and has low computational cost. This is done using bio-inspired metaheuristics with filter-based evaluation criteria.
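A toy sketch of searching feature subsets with a bio-inspired heuristic scored by a filter criterion (a plain genetic-algorithm loop with a feature-label correlation filter; the data, operators and parameters are placeholders, not the thesis' metaheuristic or its diversity objective):

import numpy as np

# Toy sketch: GA-style search over feature subsets with a filter-based score.
rng = np.random.default_rng(4)
n_samples, n_features = 300, 20
X = rng.normal(size=(n_samples, n_features))
y = (X[:, 0] + X[:, 3] - X[:, 7] + rng.normal(scale=0.5, size=n_samples) > 0).astype(float)

def filter_score(mask):
    if mask.sum() == 0:
        return 0.0
    corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in np.flatnonzero(mask)]
    return np.mean(corrs) - 0.005 * mask.sum()   # reward relevance, lightly penalize subset size

pop = rng.random((30, n_features)) < 0.3         # initial random subsets (boolean masks)
for generation in range(50):
    scores = np.array([filter_score(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]      # keep the best subsets
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, n_features)        # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_features) < 0.05     # mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents] + children)

best = pop[np.argmax([filter_score(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))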
Abstract:
In this work, we propose a two-stage algorithm for real-time fault detection and identification in industrial plants. Our proposal is based on the analysis of selected features using recursive density estimation and a new evolving classifier algorithm. More specifically, the proposed detection stage is based on the concept of density in the data space, which is not the same as a probability density function but is a very useful measure for abnormality/outlier detection. This density can be expressed by a Cauchy function and calculated recursively, which makes it memory- and computation-efficient and, therefore, suitable for on-line applications. The identification/diagnosis stage is based on a self-developing (evolving) fuzzy rule-based classifier proposed in this work, called AutoClass. An important property of AutoClass is that it can start learning from scratch: not only do the fuzzy rules not need to be prespecified, but neither does the number of classes (the number may grow, with new class labels being added by the on-line learning process), in a fully unsupervised manner. If an initial rule base exists, AutoClass can evolve it further based on newly arrived faulty-state data. To validate our proposal, we present experimental results from a didactic level-control process, where control and error signals are used as features for the fault detection and identification systems; the approach is generic, however, and the number of features can be large thanks to the computationally lean methodology, since covariance or more complex calculations, as well as storage of old data, are not required. The results obtained are significantly better than those of the traditional approaches used for comparison.
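A compact sketch of the recursive density idea used in the detection stage: the Cauchy-type density of each new sample, D(x_k) = 1 / (1 + ||x_k − μ_k||² + Σ_k − ||μ_k||²), is computed from a recursively updated mean μ_k and mean squared norm Σ_k (this is the general recursive density estimation formulation; the data, dimensionality and alarm threshold are illustrative, and the AutoClass classifier is not reproduced here):

import numpy as np

# Sketch of recursive density estimation (RDE) for abnormality detection.
rng = np.random.default_rng(5)
dim = 2
mu = np.zeros(dim)           # recursive mean of the samples
Xsq = 0.0                    # recursive mean of the squared norms ||x||^2

for k in range(1, 501):
    x = rng.normal(size=dim)
    if k > 400:
        x += 4.0             # inject an abnormal shift (simulated fault)

    mu = (k - 1) / k * mu + x / k
    Xsq = (k - 1) / k * Xsq + x @ x / k
    # Equivalent to 1 / (1 + ||x - mu||^2 + Xsq - ||mu||^2)
    density = 1.0 / (1.0 + x @ x - 2 * x @ mu + Xsq)

    if k > 10 and density < 0.05:    # low density -> outlier / fault candidate
        print(f"possible fault at sample {k}, density={density:.3f}")
        break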
Abstract:
This study aimed to characterize the sediments of the shallow continental shelf and to map the features visible in satellite images using remote sensing techniques, digital image processing and bathymetric analysis between Maxaranguape and Touros, RN. The study area is located on the shallow continental shelf of Rio Grande do Norte, Brazil, and is part of the Coral Reefs Environmental Protection Area (APA). A total of 1,186 sediment samples were collected using a Van Veen dredge, and the vessel was positioned with the aid of a Garmin 520s. The samples were treated in the laboratory to analyze sediment grain size, calcium carbonate concentration and biogenic composition. Digital Landsat-5 TM images were used for feature mapping. In this stage, band 1 (0.45–0.52 μm) was used: the image was georeferenced and its histogram adjusted, giving a better view of the bottom features and of the contacts between different bottom types. The sediment analyses showed that the sediments of the eastern continental shelf of RN are dominated by carbonate facies and a sandy-gravelly bottom, because the region is dominated by biogenic sediments composed mainly of calcareous algae. The bedform types and morphological features identified were validated by the bathymetric data and the sediment samples examined. From the results, a division of the shelf under study into well-characterized regions is suggested: (1) Turbid Zone, (2) Coral Patch Reefs Zone, (3) Mixed Carbonate Sediments Zone, (4) Algae Fouling Zone, (5) Rocky Alignment Zone, (6) Sand Waves Field, and (7) Siliciclastic Sand Deposits.
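A small sketch of the histogram-adjustment step used to enhance bottom features in a single band (a simple percentile contrast stretch; the array below is random data standing in for the TM band):

import numpy as np

# Sketch of a linear percentile contrast stretch for one image band.
rng = np.random.default_rng(6)
band = rng.normal(loc=100, scale=20, size=(400, 400))    # stand-in for band 1 digital numbers

lo, hi = np.percentile(band, [2, 98])                    # clip the darkest/brightest 2%
stretched = np.clip((band - lo) / (hi - lo), 0, 1)       # rescale to [0, 1]
stretched = (stretched * 255).astype(np.uint8)           # 8-bit image for display
print(stretched.min(), stretched.max())                  # 0 255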
Abstract:
Entrepreneurs are individuals who can transform economic and social realities by promoting development, and so they have become important generators of externalities in the regions where they operate. In Brazil, 59.9% of new ventures do not reach their fourth year of life: the mortality rate of new ventures is high. The causes of mortality are numerous and, among the behavioral aspects, one is the locus of control. This study determines the degree of association between internal locus of control and the achievement of business success for entrepreneurs in Rio Grande do Norte who participated in the EMPRETEC workshop. The approach that studies entrepreneurial behavior holds that there are psychological characteristics, associated with a set of values, attitudes and needs, that determine behavior and induce the entrepreneur to achieve success. Among these features is the locus of control, the ability of individuals to identify in their actions, or in the lack of them, the causes of their successes and failures. The locus is external when individuals attribute their results to factors outside themselves, and internal when they can identify the actions that led to success. We surveyed 223 entrepreneurs across the state, who answered the locus-of-control assessment questionnaire, EMPRETEC's self-assessment questionnaire of entrepreneurial characteristics and a questionnaire assessing business success. 71.9% were identified as successful. Among the strongest behavioral characteristics in the group of entrepreneurs are goal setting and commitment. The mean locus-of-control value found was 7.35, with a confidence interval between 7.05 and 7.66, showing that the group's locus of control is predominantly internal. We also found correlations between locus and commitment; between goal setting and commitment; between calculated risks and information seeking; between information seeking and commitment; and between commitment and independence and self-confidence. No dependence was identified between the set of features and business success, indicating the absence of an ideal profile. However, logistic regression found a significant association indicating that the lower the individual's locus of control, the greater the likelihood of achieving business success.