950 results for "Modelos fuzzy set"


Relevance: 30.00%

Abstract:

In recent decades, neural networks have become established as a major tool for the identification of nonlinear systems. Among the various types of networks used in identification, the wavelet neural network (WNN) stands out. This network combines the characteristics of wavelet multiresolution theory with the learning and generalization ability of conventional neural networks, usually providing more accurate models than those obtained with traditional networks. An extension of the WNN combines the neuro-fuzzy ANFIS (Adaptive Network Based Fuzzy Inference System) structure with wavelets, giving rise to the Fuzzy Wavelet Neural Network (FWNN) structure. This network is very similar to ANFIS, with the difference that the traditional polynomials in its consequents are replaced by WNNs. This work proposes the identification of nonlinear dynamical systems with a modified FWNN. In the proposed structure, only wavelet functions are used in the consequents, which simplifies the structure and reduces the number of adjustable parameters of the network. To evaluate the modified FWNN, its performance is analyzed, examining advantages, disadvantages and cost-effectiveness compared with other FWNN structures found in the literature. The evaluations are carried out through the identification of two simulated systems traditionally used in the literature and of a real nonlinear system consisting of a nonlinear multi-section tank. Finally, the network is used to infer temperature and humidity values inside a neonatal incubator. These analyses are based on several criteria, such as mean squared error, number of training epochs, number of adjustable parameters and the variation of the mean squared error, among others. The results show the generalization ability of the modified structure, despite the simplification performed.
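
As an illustration of the modified consequent described above, the sketch below (not the thesis's actual implementation; all array names and the choice of a Mexican-hat mother wavelet are assumptions) computes one forward pass of an FWNN whose rule consequents are weighted sums of wavelets instead of ANFIS-style polynomials.

import numpy as np

def mexican_hat(z):
    """Mexican-hat (Ricker) mother wavelet."""
    return (1.0 - z**2) * np.exp(-0.5 * z**2)

def fwnn_forward(x, centers, sigmas, wav_weights, wav_trans, wav_dil):
    """Illustrative forward pass of an FWNN whose rule consequents are
    wavelet combinations rather than ANFIS-style polynomials.

    x           : (n_inputs,) input vector
    centers     : (n_rules, n_inputs) Gaussian membership centers
    sigmas      : (n_rules, n_inputs) Gaussian membership widths
    wav_weights : (n_rules, n_wavelets) output weights of each rule's wavelet net
    wav_trans   : (n_rules, n_wavelets, n_inputs) wavelet translations
    wav_dil     : (n_rules, n_wavelets, n_inputs) wavelet dilations
    """
    # Rule firing strengths: product of Gaussian memberships, then normalization
    mu = np.exp(-0.5 * ((x - centers) / sigmas) ** 2)      # (n_rules, n_inputs)
    w = mu.prod(axis=1)                                    # (n_rules,)
    w_norm = w / w.sum()

    # Wavelet consequents: multidimensional wavelets as products of 1-D ones
    z = (x - wav_trans) / wav_dil                          # (n_rules, n_wavelets, n_inputs)
    psi = mexican_hat(z).prod(axis=2)                      # (n_rules, n_wavelets)
    y_rule = (wav_weights * psi).sum(axis=1)               # (n_rules,)

    # Output: firing-strength-weighted aggregation of the rule outputs
    return float(np.dot(w_norm, y_rule))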

Relevance: 30.00%

Abstract:

The development of wireless telecommunications in recent years has been remarkable, leading researchers to conceive new ideas and techniques aimed at increasing the capacity and quality of system services. Ever smaller cells, ever higher frequencies and increasingly complex environments demand more accurate models; propagation prediction techniques fit into this context and must produce results with a margin of error compatible with the next generations of communication systems. The objective of this work is to present the results of a propagation measurement campaign intended to characterize mobile system coverage in the city of Natal (state of Rio Grande do Norte, Brazil). A mobile laboratory was set up using infrastructure available to and frequently used by ANATEL. The measurements were taken in three different areas, characterized by tall buildings, high relief, the presence of trees and towers of different heights; they covered the city's central zone, a suburban/rural zone and a stretch of coast surrounded by sand dunes. It is important to highlight that the analysis took into consideration the current reality of cellular systems, with coverage provided by reduced cells in order to allow greater frequency reuse and greater telephone traffic capacity. Most of the telephone traffic per cell in the city of Natal occurs within a range of less than 3 (three) km from the base station. The frequency band used was 800 MHz, corresponding to the control channels of the respective sites, which adopt FSK modulation. This dissertation starts by presenting an overview of the models used for propagation prediction. It then describes the measurement methodology, in which the control channels of the cellular system were used. The results obtained were compared with several existing prediction models, and some adaptations were developed using regression techniques in order to obtain better-fitted solutions. Furthermore, according to regulations from the former Brazilian holding company Telebrás, a minimum coverage of 90% of a previously determined area, 90% of the time, must be met when deploying cellular systems. For such a value to be reached, considerations and studies involving the specific environment being covered are important. This work aims to contribute to this aspect.
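
As a minimal illustration of adapting a prediction model by regression, the sketch below fits the path-loss exponent of a log-distance model to measured data by least squares; the function name, reference distance and sample values are hypothetical and not taken from the campaign.

import numpy as np

def fit_log_distance_model(d_m, path_loss_db, d0_m=100.0):
    """Least-squares fit of PL(d) = PL(d0) + 10*n*log10(d/d0) to measured data.

    d_m          : distances from the base station (metres)
    path_loss_db : measured path loss at each distance (dB)
    Returns (PL_d0, n): intercept at the reference distance and path-loss exponent.
    """
    x = 10.0 * np.log10(np.asarray(d_m, dtype=float) / d0_m)
    A = np.column_stack([np.ones_like(x), x])
    (pl_d0, n), *_ = np.linalg.lstsq(A, np.asarray(path_loss_db, dtype=float), rcond=None)
    return pl_d0, n

# Hypothetical usage with synthetic 800 MHz measurements:
# pl0, n = fit_log_distance_model(d_m=[150, 400, 900, 2100], path_loss_db=[95, 108, 118, 130])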

Relevance: 30.00%

Abstract:

This work presents the specification and implementation of a transformation language for models defined according to the MOF (Meta Object Facility) specification of the OMG (Object Management Group). The specification uses an approach based on ECA (Event-Condition-Action) rules and was built from a set of previously defined usage scenarios. The parser, responsible for ensuring that the syntactic structure of the language is correct, was constructed with the JavaCC (Java Compiler Compiler) tool, and the syntax of the language was described in EBNF (Extended Backus-Naur Form). The implementation is divided into three parts: the interpreter itself, written in Java; an executor for the actions specified in the language; and its integration with the chosen type of repository (generated by the DSTC dMOF tool). A final prototype was developed and tested against the previously defined scenarios.
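
A rough sketch of the ECA idea behind such a transformation language is given below in Python (the actual language is textual, parsed with JavaCC from an EBNF grammar; the classes, fields and example rule here are hypothetical).

from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class ECARule:
    """Hypothetical in-memory form of an Event-Condition-Action transformation rule."""
    event: str                              # e.g. "ElementCreated"
    condition: Callable[[Any], bool]        # predicate over the affected model element
    action: Callable[[Any], None]           # transformation applied to the repository

def dispatch(rules, event_name, element):
    """Run every rule whose event matches and whose condition holds."""
    for rule in rules:
        if rule.event == event_name and rule.condition(element):
            rule.action(element)

# Hypothetical usage: capitalize class names that start with a lowercase letter.
rules = [ECARule(
    event="ElementCreated",
    condition=lambda e: e["kind"] == "Class" and e["name"][:1].islower(),
    action=lambda e: e.update(name=e["name"].capitalize()),
)]
dispatch(rules, "ElementCreated", {"kind": "Class", "name": "customer"})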

Relevance: 30.00%

Abstract:

In this work we study the survival cure rate model proposed by Yakovlev (1993), considered in a competing risks setting. Covariates are introduced to model the cure rate, and we allow some covariates to have missing values. We consider only the case in which the missing covariates are categorical, and we implement the EM algorithm via the method of weights for maximum likelihood estimation. We present a Monte Carlo simulation experiment to compare the properties of the estimators based on this method with those of the estimators under the complete-case scenario. In this experiment we also evaluate the impact on the parameter estimates of increasing the proportion of immune and censored individuals relative to the non-immune ones. We illustrate the proposed methodology with a real data set involving the time until graduation for undergraduate Statistics students at the Universidade Federal do Rio Grande do Norte.
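
Yakovlev's cure rate model is usually written as the promotion-time model S_pop(t|x) = exp(-theta(x) F(t)), with cure fraction exp(-theta(x)). Assuming that form and a Weibull latent distribution, the sketch below evaluates the complete-data log-likelihood; the weighted EM step for missing categorical covariates would replace each incomplete record by one weighted copy per candidate level, which is not shown here.

import numpy as np

def promotion_time_loglik(params, t, delta, X):
    """Log-likelihood of the promotion-time cure model
    S_pop(t|x) = exp(-theta(x) * F(t)), theta(x) = exp(x'beta),
    with a Weibull latent distribution F (shape a, scale b).

    t      : observed times; delta : event indicator (1 = failure, 0 = censored)
    X      : covariate matrix (n, p) including an intercept column
    params : concatenation of beta (p,) and log(a), log(b)
    """
    p = X.shape[1]
    beta, log_a, log_b = params[:p], params[p], params[p + 1]
    a, b = np.exp(log_a), np.exp(log_b)
    theta = np.exp(X @ beta)

    F = 1.0 - np.exp(-(t / b) ** a)                              # Weibull CDF
    f = (a / b) * (t / b) ** (a - 1) * np.exp(-(t / b) ** a)     # Weibull pdf

    log_density = np.log(theta) + np.log(f) - theta * F          # failures
    log_survival = -theta * F                                    # censored
    return np.sum(delta * log_density + (1 - delta) * log_survival)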

Relevance: 30.00%

Abstract:

We present residual analysis techniques to assess the fit of correlated survival data by accelerated failure time models (AFTM) with random effects. We propose an imputation procedure for censored observations and consider three types of residuals to evaluate different model characteristics. We illustrate the proposal with the analysis of an AFTM with random effects applied to a real data set involving times between failures of oil well equipment.
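
One standard way to handle censored observations when computing standardized residuals in a log-normal AFT setting is to replace them by the conditional mean of a normal error beyond the censoring point, E[Z | Z > c] = phi(c)/(1 - Phi(c)); the sketch below uses that device, which may differ from the imputation procedure actually proposed in the work.

import numpy as np
from scipy.stats import norm

def imputed_residuals(log_t, delta, lin_pred, random_effect, sigma):
    """Standardized AFT residuals with a normal-tail imputation for censored cases.

    log_t         : log of the observed (possibly censored) times
    delta         : 1 for observed failures, 0 for right-censored times
    lin_pred      : X @ beta for each observation
    random_effect : estimated random effect of each observation's cluster
    sigma         : scale of the (assumed normal) error term
    """
    r = (np.asarray(log_t) - np.asarray(lin_pred) - np.asarray(random_effect)) / sigma
    # E[Z | Z > c] for a standard normal error: phi(c) / (1 - Phi(c))
    tail_mean = norm.pdf(r) / norm.sf(r)
    return np.where(np.asarray(delta) == 1, r, tail_mean)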

Relevance: 30.00%

Abstract:

Despite the emergence of other forms of artificial lift, sucker rod pumping systems remain dominant because of their operational flexibility and lower investment cost compared with other lifting techniques. A successful rod pumping design must deliver the estimated flow rate while keeping the wear of the pumping equipment in the selected configuration under control. Balancing these requirements is particularly challenging, especially for designers who still lack the experience needed to produce good pumping designs in a timely manner. Even with the various computer applications available on the market to facilitate this task, designers face a gruelling process of trial and error until they reach the most appropriate combination of equipment for installation in the well. This thesis proposes an expert system for the design of sucker rod pumping systems. Its purpose is to guide a petroleum engineer in selecting a set of equipment appropriate to the characteristics of the oil to be lifted to the surface. Factors such as the level of gas separation, the presence of corrosive elements, and the possibility of sand production and wax deposition are taken into account when selecting the pumping unit, the sucker-rod string, the subsurface pump and their operating mode. The system approximates the inference process of human reasoning, which leads to results closer to those obtained by a specialist. To that end, its production rules are based on the theory of fuzzy sets, which is able to model the vague concepts typically present in human reasoning. The operating parameters of the pumping system are calculated with the API RP 11L method. Based on the input information, the system returns to the user a set of pumping configurations that meet a given design flow rate without subjecting the selected equipment to loads beyond what it can bear.
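
To illustrate the kind of fuzzy production rule such a system relies on, the sketch below grades two vague well conditions with triangular membership functions and combines them with a Mamdani-style minimum; the linguistic terms, breakpoints and rule are hypothetical and do not reproduce the thesis's rule base or the API RP 11L calculations.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rule_strength(gas_fraction, h2s_ppm):
    """Firing strength of a hypothetical rule:
    IF free-gas fraction is HIGH AND H2S concentration is SEVERE
    THEN recommend a gas-handling pump and corrosion-resistant rods."""
    high_gas = tri(gas_fraction, 0.15, 0.35, 0.60)   # hypothetical breakpoints
    severe_h2s = tri(h2s_ppm, 20.0, 80.0, 200.0)
    return min(high_gas, severe_h2s)                 # Mamdani-style AND

# Example: a well with 30% free gas and 60 ppm H2S
print(rule_strength(0.30, 60.0))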

Relevance: 30.00%

Abstract:

The segmentation of an image aims to subdivide it into constituent regions or objects that have some relevant semantic content. This subdivision can also be applied to videos; in that case, however, the objects appear in the various frames that compose the video. The task of segmenting an image becomes more complex when it is composed of objects defined by textural features, for which color information alone is not a good descriptor. Fuzzy segmentation is a region-growing segmentation algorithm that uses affinity functions to assign to each element in an image a grade of membership (between 0 and 1) to each object. This work presents a modification of the fuzzy segmentation algorithm aimed at improving its time and space complexity. The algorithm was adapted to segment color videos, treating them as 3D volumes. To perform segmentation on videos, either a conventional color model or a hybrid model obtained by a method for choosing the best channels was used. The fuzzy segmentation algorithm was also applied to texture segmentation by using adaptive affinity functions defined for each object texture. Two types of affinity functions were used, one defined with the normal (Gaussian) probability distribution and the other with the skew divergence. The latter, a variation of the Kullback-Leibler divergence, is a measure of the difference between two probability distributions. Finally, the algorithm was tested on several videos and on texture mosaics composed of images from the Brodatz album.
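
For reference, the skew divergence mentioned above is SD_alpha(p, q) = KL(p || alpha*q + (1 - alpha)*p), which remains finite when q assigns zero probability to some bins; a minimal sketch:

import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def skew_divergence(p, q, alpha=0.99):
    """Skew divergence: KL of p against a mixture of q with a little of p,
    avoiding the infinities of plain KL when q has zero-probability bins."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return kl(p, alpha * q + (1.0 - alpha) * p)

# Example with two (already normalized) texture histograms:
# skew_divergence([0.2, 0.5, 0.3], [0.1, 0.0, 0.9])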

Relevance: 30.00%

Abstract:

Requirements engineering is often seen in agile methods as a bureaucratic activity that makes the process less agile; at the same time, the lack of documentation in agile development environments is identified as one of the methodology's main challenges. There is thus a contradiction between what the agile methodology claims and what happens in real environments. For example, in agile methods user stories are widely used to describe requirements, but this form of description is not enough by itself, because a user story is too narrow an artifact to represent and detail requirements. Activities such as verifying the software context and the dependencies between stories are also limited when only this artifact is used. In requirements engineering there are goal-oriented approaches that bring benefits to requirements documentation, including completeness of requirements, analysis of alternatives and support for the rationale behind requirements. Among these approaches, the i* modeling technique stands out by providing a graphical view of the actors involved in the system and their dependencies. This work proposes an additional resource aimed at reducing this lack of documentation in agile methods. Its objective is to provide a graphical view of the software requirements and their relationships through i* models, thereby enriching the requirements in agile methods. To that end, we propose a set of heuristics to map requirements written as user stories into i* models. These models can be used as a form of documentation in agile environments, because by mapping user stories into i* models the requirements are viewed more broadly, with their proper relationships, according to the business environment they are meant to serve.
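
A hypothetical example of such a mapping heuristic: a story in the canonical form "As a <role>, I want <goal> so that <benefit>" can be read as an i* actor, a goal dependency on the system and a softgoal. The regex and element names below are illustrative only; the thesis defines its own heuristic set.

import re

STORY = re.compile(
    r"As an? (?P<actor>.+?), I want (?P<goal>.+?)(?: so that (?P<softgoal>.+))?$",
    re.IGNORECASE,
)

def story_to_istar(user_story):
    """Map one user story onto a rough i*-style fragment:
    an actor, a goal the actor depends on the system for, and an optional softgoal."""
    m = STORY.match(user_story.strip().rstrip("."))
    if not m:
        return None
    return {
        "actor": m.group("actor"),
        "goal_dependency": {"depender": m.group("actor"),
                            "dependee": "System",
                            "dependum": m.group("goal")},
        "softgoal": m.group("softgoal"),
    }

print(story_to_istar("As a librarian, I want to register new books so that the catalog stays up to date."))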

Relevance: 30.00%

Abstract:

Data clustering is applied in various fields, such as data mining, image processing and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The fuzzy c-means algorithm (FCM) is one of the most widely used and discussed fuzzy clustering algorithms in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers, so the choice of a good set of initial centers is very important for the algorithm's performance. In FCM, however, the initial centers are chosen randomly, which makes it difficult to find a good set. This work proposes three new methods to obtain initial cluster centers deterministically for the FCM algorithm, which can also be used in FCM variants; here, these initialization methods were also applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers, reducing the number of iterations needed for these algorithms to converge and the processing time, without degrading cluster quality, and even improving it in some cases. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the modified FCM and ckMeans algorithms with the proposed initialization methods when applied to several data sets.
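
The three initialization methods are specific to this work; as a generic stand-in, the sketch below picks initial centers deterministically by a farthest-first traversal and then applies the standard FCM membership update u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).

import numpy as np

def farthest_first_centers(X, c):
    """Deterministic initial centers: start from the point closest to the data
    mean, then repeatedly add the point farthest from the centers chosen so far.
    (A generic illustration; the thesis proposes its own three methods.)"""
    centers = [X[np.argmin(np.linalg.norm(X - X.mean(axis=0), axis=1))]]
    for _ in range(c - 1):
        d = np.min([np.linalg.norm(X - ctr, axis=1) for ctr in centers], axis=0)
        centers.append(X[np.argmax(d)])
    return np.array(centers)

def fcm_memberships(X, centers, m=2.0):
    """Standard FCM membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

# Example on synthetic data:
X = np.random.default_rng(0).normal(size=(200, 2))
U = fcm_memberships(X, farthest_first_centers(X, c=3))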

Relevance: 30.00%

Abstract:

Image segmentation is the process of subdividing an image into constituent regions or objects that have similar features. In video segmentation, beyond subdividing the frames into objects with similar features, there is a consistency requirement among the segmentations of successive frames of the video. Fuzzy segmentation is a region-growing technique that assigns to each element in an image (which may have been corrupted by noise and/or shading) a grade of membership between 0 and 1 to an object. In this work we present an application that uses a fuzzy segmentation algorithm to identify and select particles in micrographs, and an extension of the algorithm to perform video segmentation. Here, a video shot is treated as a three-dimensional volume whose different z slices are occupied by the different frames of the shot. The volume is interactively segmented based on selected seed elements, which determine the affinity functions from their motion and color properties. The color information can be extracted from a specific color space or from three channels of a set of color models selected according to the correlation of the information across channels. The motion information is provided in the form of dense optical flow maps. Finally, the segmentation of real and synthetic videos and its application in a non-photorealistic rendering (NPR) tool are presented.
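
One simple way to realize the channel-selection step is to keep the candidate channels whose pairwise absolute correlation is lowest; the sketch below does this exhaustively over channel triples and is an illustration, not necessarily the method used in the dissertation.

import numpy as np
from itertools import combinations

def pick_least_correlated(channels, k=3):
    """Given a dict name -> 2-D channel image, return the k channel names whose
    pairwise absolute correlation is smallest (exhaustive over combinations)."""
    names = list(channels)
    flat = np.array([channels[n].ravel().astype(float) for n in names])
    corr = np.abs(np.corrcoef(flat))
    best, best_score = None, np.inf
    for combo in combinations(range(len(names)), k):
        idx = np.array(combo)
        score = corr[np.ix_(idx, idx)].sum()   # includes the constant diagonal
        if score < best_score:
            best, best_score = combo, score
    return [names[i] for i in best]

# Hypothetical usage with channels extracted beforehand from RGB, HSV and Lab frames:
# selected = pick_least_correlated({"R": r, "H": h, "S": s, "L": l, "a": a, "b": b})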

Relevance: 30.00%

Abstract:

We consider prediction techniques based on accelerated failure time models with random effects for correlated survival data. Besides the Bayesian approach through the empirical Bayes estimator, we also discuss the use of a classical predictor, the empirical best linear unbiased predictor (EBLUP). To illustrate the use of these predictors, we consider applications to a real data set from the oil industry. More specifically, the data set involves the mean time between failures of petroleum-well equipment in the Potiguar Basin (Bacia Potiguar). The goal of this study is to predict the risk/probability of failure in order to support a preventive maintenance program. The results show that both methods are suitable for predicting future failures, supporting good decisions regarding the allocation and economy of resources for preventive maintenance.
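
In the special case of a random-intercept model on the log-time scale, the EBLUP of a cluster effect reduces to a shrunken cluster-mean residual, b_j = n_j*sigma_b^2 / (n_j*sigma_b^2 + sigma_e^2) * (mean residual of cluster j); the sketch below implements that simplified case, with hypothetical variance components, rather than the dissertation's full AFT machinery.

import numpy as np

def eblup_random_intercepts(resid, cluster, sigma2_b, sigma2_e):
    """EBLUP of random intercepts in a random-intercept model:
    b_j = (n_j * sigma2_b) / (n_j * sigma2_b + sigma2_e) * mean(residual in cluster j).

    resid   : residuals y - X @ beta_hat (log-times in an AFT setting)
    cluster : cluster label for each observation (e.g. the well)
    """
    resid, cluster = np.asarray(resid, float), np.asarray(cluster)
    eblups = {}
    for j in np.unique(cluster):
        r_j = resid[cluster == j]
        shrink = (len(r_j) * sigma2_b) / (len(r_j) * sigma2_b + sigma2_e)
        eblups[j] = shrink * r_j.mean()
    return eblups

# Hypothetical usage for a few wells with known variance components:
# eblup_random_intercepts(resid, wells, sigma2_b=0.4, sigma2_e=1.1)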

Relevance: 30.00%

Abstract:

This project was developed as a partnership between the Laboratory of Stratigraphic Analysis of the Geology Department of UFRN and the company Millennium Inorganic Chemicals Mineração Ltda. The company is located at the northern end of the Paraíba coast, in the municipality of Mataraca. Millennium's main prospected products are heavy minerals, such as ilmenite, rutile and zircon, present in the dune sands. These dunes are predominantly inactive and overlie the upper portion of the Barreiras Formation rocks. Mining is carried out with a dredge that operates in an artificial lake on the dunes. The dredge removes sand from the bottom of the lake (after breaking down the lake borders with water jets) and sends it through pipelines to the concentration plant, where the minerals are then separated. The present work consisted of acquiring the external geometry of the dunes so that, in the end, a 3D static model of these sedimentary deposits could be built, with emphasis on the behavior of the structural top of the Barreiras Formation rocks (the lower limit of the deposit). Knowledge of this surface is important for the company's mine planning, because a calculation error could make the dredge work too close to this limit, with the risk that rock fragments obstruct the dredge, generating financial losses both from equipment repair and from days of halted production. During the field campaigns (carried out in 2006 and 2007), topographic surveying with a total station and geodetic GPS was used, as well as shallow geophysical acquisition with GPR (Ground Penetrating Radar). Almost 10.4 km of topographic profiles and 10 km of GPR profiles were acquired. The geodetic GPS was used for georeferencing the data and for the topographic survey of a 630 m traverse line in the 2007 campaign. GPR proved to be a reliable, ecologically clean and fast acquisition method, with a low cost compared with traditional survey methods. The main advantage of this equipment is that it provides continuous information on the upper surface of the Barreiras Formation rocks. The 3D static models were built from the acquired data using two 3D visualization software packages: GoCAD 2.0.8 and Datamine. The 3D visualization allows a better understanding of the behavior of the Barreiras surface and makes it possible to perform several types of measurements, facilitating such calculations and allowing the procedures used for mineral extraction to be carried out with greater safety.

Relevance: 30.00%

Abstract:

The objective of this study was to compare random regression models with different residual variance structures in order to find the best model for litter size at birth (TLN). A total of 1,701 TLN records were analyzed with a single-trait random regression animal model. The fixed and random regressions were represented by continuous functions of parity order, fitted with Legendre orthogonal polynomials of order 3. To determine the best modeling of the residual variance, heterogeneity of variance was considered through 1 to 7 residual variance classes. The general model of analysis included contemporary group as a fixed effect; the fixed regression coefficients modeling the mean trajectory of the population; the random regression coefficients for the direct additive genetic, common litter and animal permanent environmental effects; and the random residual effect. The likelihood ratio test, the Akaike information criterion and the Schwarz Bayesian information criterion indicated the model assuming homogeneous residual variance as the one providing the best fit to the data. The heritability estimates were close to zero (0.002 to 0.006). The permanent environmental effect increased from the 1st (0.06) to the 5th (0.28) parity and decreased from that point up to the 7th (0.18). The common litter effect showed low values (0.01 to 0.02). Assuming homogeneous residual variance was more adequate for modeling the variances associated with litter size at birth in this data set.
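
The Legendre covariables used in such random regression models can be built as sketched below: parity order is rescaled to [-1, 1] and the normalized polynomials sqrt((2k+1)/2) * P_k(x), k = 0..3, are evaluated (an illustrative construction, not the authors' code).

import numpy as np
from numpy.polynomial import legendre

def legendre_covariables(parity, order=3, t_min=1, t_max=7):
    """Normalized Legendre polynomial covariables of degree 0..order evaluated at
    each parity, as commonly used in random regression animal models.

    The parity t is first rescaled to x in [-1, 1]; the k-th covariable is
    sqrt((2k+1)/2) * P_k(x), where P_k is the k-th Legendre polynomial."""
    x = -1.0 + 2.0 * (np.asarray(parity, float) - t_min) / (t_max - t_min)
    cols = []
    for k in range(order + 1):
        coef = np.zeros(k + 1)
        coef[k] = 1.0                       # select P_k
        cols.append(np.sqrt((2 * k + 1) / 2.0) * legendre.legval(x, coef))
    return np.column_stack(cols)

# Covariables for parities 1 through 7 (design matrix of the random regressions):
Phi = legendre_covariables(np.arange(1, 8))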

Relevance: 30.00%

Abstract:

Growth and reproduction data from 573 Guzerá cows, born between 1961 and 1985 at Fazenda Canoas, in Curvelo, MG, were analyzed with the objective of establishing an average growth pattern by means of a mathematical model that fits the data adequately. The Brody, Bertalanffy, Logistic, Gompertz and Richards models were fitted to the weight/age data, collected up to 1992, and compared for goodness of fit. The estimated asymptotic weights and maturity rates were, respectively: 464.49 and 0.046 for the Brody model; 453.18 and 0.065 for the Bertalanffy; 447.05 and 0.085 for the Logistic; 449.89 and 0.075 for the Gompertz; and 458.26 and 0.055 for the Richards. The Richards model presented computational difficulties in fitting the data. The other models proved adequate to describe the growth of these animals, showing small variations in goodness of fit according to the criteria used. The Bertalanffy model was chosen to represent the average growth curve of the animals because it presented a superior fit across the set of criteria.
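
As an illustration, the Bertalanffy curve chosen above, W(t) = A * (1 - b * exp(-k*t))^3, can be fitted by nonlinear least squares as sketched below; the age/weight values and starting points are synthetic, not the Fazenda Canoas data.

import numpy as np
from scipy.optimize import curve_fit

def bertalanffy(t, A, b, k):
    """Von Bertalanffy growth curve for body weight: W(t) = A * (1 - b*exp(-k*t))**3."""
    return A * (1.0 - b * np.exp(-k * t)) ** 3

# Hypothetical weight/age records (age in months, weight in kg)
age = np.array([6, 12, 18, 24, 36, 48, 72, 96], dtype=float)
weight = np.array([110, 170, 225, 270, 330, 380, 430, 450], dtype=float)

# A ~ asymptotic weight, k ~ maturity rate (the quantities compared across models)
(A, b, k), _ = curve_fit(bertalanffy, age, weight, p0=[450.0, 0.6, 0.05])
print(round(A, 2), round(k, 3))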

Relevance: 30.00%

Abstract:

Estimating thermal comfort in modern poultry production is important so that climate-control systems can be activated at the right time, reducing losses and increasing yields. Although the current literature presents some thermal comfort indices used for this estimation, they are based only on thermal environment conditions and do not consider important animal-related factors, such as genetics and acclimatization capacity, generally providing an inadequate estimate of the birds' thermal comfort. This work developed the Fuzzy Thermal Comfort Index (IFCT) to estimate the thermal comfort of broiler chickens, considering that the mechanism used by the birds for heat loss in environments outside the thermoneutral zone is peripheral vasodilation, which increases surface temperature and can therefore be used as an indicator of the comfort state. The IFCT was developed from two experiments that provided 108 different environmental scenarios. Infrared thermographic images were used to record feather and skin surface temperatures and the feathering degree of the birds. For the same thermal environment scenarios observed in the experiments, the results obtained with the IFCT and with the Temperature and Humidity Index (ITU) were compared. The results validated the IFCT for estimating the thermal comfort of broiler chickens, the index being specific in identifying thermal danger conditions, which are common in housing in tropical countries. This characteristic is desirable in models that estimate the thermal welfare of broilers, since situations classified as danger lead to the expenditure of resources to avoid production losses.
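
For comparison, the ITU (Temperature and Humidity Index) baseline mentioned above is often computed as THI = 0.8*T + (RH/100)*(T - 14.4) + 46.4, with T in degrees Celsius and RH in percent; coefficients vary across the literature, so treat this form as an assumption. A minimal sketch:

def thi(temp_c, rh_percent):
    """One commonly cited Temperature and Humidity Index (THI/ITU) form:
    THI = 0.8*T + (RH/100)*(T - 14.4) + 46.4, with T in deg C and RH in %.
    Coefficient choices vary across the literature; this is an illustrative variant."""
    return 0.8 * temp_c + (rh_percent / 100.0) * (temp_c - 14.4) + 46.4

# Example: a hot, humid broiler-house scenario; thresholds around 74-79 are often
# used to flag alert/danger conditions, depending on the formulation adopted.
print(thi(32.0, 70.0))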