988 results for "Probabilidade geometrica"


Relevance: 10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Abstract:

Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since these problems are generally accepted to be unsolvable in polynomial time. Initially, such solutions were based on heuristics; currently, metaheuristics are preferred for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an "Operon" heuristic for constructing the information chains needed to implement transgenetic (evolutionary) algorithms, relying mainly on statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses suited to evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to build good-quality dynamic information chains that promote an "intelligent" search of the solution space. The applications target the Traveling Salesman Problem (TSP) and are based on a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by a minimum limit on the coefficient of variation of the individuals' fitness function, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses Logistic Regression, based on the probability that the algorithm under test finds an optimal solution for a TSP instance. The second uses Survival Analysis, based on the distribution of the execution time observed until an optimal solution is reached.
The third uses a non-parametric Analysis of Variance on the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one Euclidean TSP instances with up to 1,655 cities. The first two experiments deal with the adjustment of four parameters of the ProtoG algorithm in an attempt to improve its performance. The last four evaluate the performance of ProtoG against the three other algorithms. For these sixty-one instances, statistical tests provide evidence that ProtoG outperforms the three other algorithms on fifty instances. Moreover, for the thirty-six instances considered in the last three trials, where performance was evaluated through PES, the average PES obtained with ProtoG was below 1% in almost half of the instances; the largest average, 3.52%, was reached on an instance of 1,173 cities. ProtoG can therefore be considered a competitive algorithm for solving the TSP, since average PES values above 10% are not rare in the literature for instances of this size.
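The PES metric defined above reduces to a one-line computation; a minimal sketch (the function name and example costs are illustrative, not taken from the thesis):

```python
def percent_error_of_solution(found_cost: float, best_known_cost: float) -> float:
    """Percent Error of the Solution (PES): percentage by which the cost
    found by the algorithm exceeds the best cost in the literature."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

# Example: a tour of cost 1035 against a best-known tour of cost 1000
# gives a PES of 3.5%, comparable to the worst ProtoG average reported above.
```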

Abstract:

This work introduces a new method for mapping an environment with three-dimensional information extracted from visual data, intended for accurate robotic navigation. Many 3D mapping approaches based on occupancy grids require high computational effort both to build and to store the map. We introduce a 2.5-D occupancy-elevation grid map: a discrete mapping approach in which each cell stores the occupancy probability, the height of the terrain at that place in the environment, and the variance of this height. This 2.5-dimensional representation lets a mobile robot know both whether a place in the environment is occupied by an obstacle and how tall that obstacle is, so it can decide whether the obstacle can be traversed. The sensory information needed to construct the map is provided by a stereo vision system, modeled with a robust probabilistic approach that accounts for the noise present in stereo processing. The resulting maps support tasks such as decision making in autonomous navigation, exploration, localization, and path planning. Experiments carried out with a real mobile robot demonstrate that the proposed approach yields useful maps for autonomous robot navigation.
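The cell contents described above (occupancy probability, terrain height, height variance) can be sketched as a small data structure; the field names, defaults, and traversability thresholds below are illustrative assumptions, not the thesis' actual implementation:

```python
from dataclasses import dataclass

@dataclass
class OccupancyElevationCell:
    """One cell of a 2.5-D occupancy-elevation grid: occupancy probability,
    estimated terrain height, and the variance of that height estimate."""
    p_occupied: float = 0.5   # prior: occupancy unknown
    height: float = 0.0       # metres
    height_var: float = 1e6   # very large variance = height unknown

    def traversable(self, max_climb: float, p_thresh: float = 0.7) -> bool:
        # A cell is traversable when it is likely free, or when it is
        # occupied by an obstacle low enough for the robot to climb over.
        if self.p_occupied < p_thresh:
            return True
        return self.height <= max_climb
```

This is exactly the decision the abstract motivates: unlike a flat occupancy grid, the elevation estimate lets the robot distinguish a kerb from a wall.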

Abstract:

In this work, we propose a probabilistic mapping method in which the mapped environment is represented by a modified occupancy grid. The main idea is to let a mobile robot construct, in a systematic and incremental way, the geometry of the underlying space, obtaining a complete environment map at the end. As a consequence, the robot can move through the environment safely, based on a confidence value for the data obtained from its perception system. The map is represented coherently with respect to the sensory data, noisy or not, that come from the robot's exteroceptive and proprioceptive sensors. The characteristic noise in the data from these sensors is treated by probabilistic modeling, so that its effects are visible in the final result of the mapping process. The results of the experiments performed indicate the viability of the methodology and its applicability to autonomous mobile robotics, making it a contribution to the field.
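The incremental fusion of noisy readings into a grid cell is usually done with the standard log-odds Bayes update; the sketch below shows that common scheme, not necessarily the exact modified update used in this thesis:

```python
import math

def logit(p: float) -> float:
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def update_cell(p_prior: float, p_sensor: float) -> float:
    """Fuse one sensor reading (inverse sensor model probability) into a
    cell's occupancy probability via the log-odds Bayes update.
    A prior of 0.5 contributes zero log-odds, i.e. no information."""
    l = logit(p_prior) + logit(p_sensor)
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# Repeated "occupied" readings (p_sensor > 0.5) drive the cell toward 1;
# "free" readings (p_sensor < 0.5) drive it toward 0.
```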

Abstract:

The idea of considering imprecision in probabilities is old, beginning with the work of George Boole, who in 1854 sought to reconcile classical logic, which allows the modeling of complete ignorance, with probabilities. In 1921, John Maynard Keynes made explicit use of intervals in his book to represent imprecision in probabilities. But only with the work of Walley in 1991 were principles established that a probability theory dealing with imprecision should respect. With the emergence of fuzzy set theory, introduced by Lotfi Zadeh in 1965, another way of dealing with uncertainty and imprecision of concepts became available. Several ways of bringing Zadeh's ideas into probability theory were quickly proposed, handling imprecision either in the events associated with the probabilities or in the probability values themselves. In particular, beginning in 2003, James Buckley developed a probability theory in which the values of the probabilities are fuzzy numbers. This fuzzy probability follows principles analogous to Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of employing very precise values to deal with uncertainty (how can one distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that incorporate some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree of uncertainty to model the hesitation felt when assigning a membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree.
In this approach, by contrast, the non-membership degree is somewhat independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation at the moment of assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory. It is worth noting that the term "intuitionistic" here has no relation to the term as used in intuitionistic logic. In this work, two proposals for interval probability are developed: restricted interval probability and unrestricted interval probability. Two notions of fuzzy probability are also introduced, constrained fuzzy probability and unconstrained fuzzy probability, and finally two notions of intuitionistic fuzzy probability are introduced: restricted intuitionistic fuzzy probability and unrestricted intuitionistic fuzzy probability.
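The hesitation degree described above is simply the slack between the membership degree, the non-membership degree, and 1; a minimal sketch (helper name is illustrative):

```python
def hesitation_degree(mu: float, nu: float) -> float:
    """For an Atanassov intuitionistic fuzzy element with membership mu
    and non-membership nu (requiring mu + nu <= 1), the hesitation degree
    is pi = 1 - mu - nu."""
    if not (0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0):
        raise ValueError("require 0 <= mu, nu <= 1 and mu + nu <= 1")
    return 1.0 - mu - nu

# In an ordinary (Zadeh) fuzzy set nu = 1 - mu by default, so the
# hesitation degree is always 0.
```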

Abstract:

Currently, one of the biggest challenges in data mining is performing cluster analysis on complex data. Several techniques have been proposed but, in general, they only achieve good results within specific domains, and there is no consensus on the best way to group this kind of data. These techniques usually fail because of unrealistic assumptions about the true probability distribution of the data. Motivated by this, this thesis proposes a new measure based on the Cross Information Potential, which uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach preserves the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From this measure, two cost functions and three algorithms are proposed for cluster analysis. Since the use of Information Theory captures the relationship between different patterns regardless of assumptions about the nature of that relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results hold both for synthetic data designed to test the algorithms in specific situations and for real data drawn from problems in different fields.
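The Cross Information Potential named above is, in its basic Parzen-window form, the mean pairwise Gaussian kernel value between two samples. The 1-D sketch below shows only that basic descriptor; the thesis' extension with representative points and per-cluster statistics is not reproduced here:

```python
import math

def cross_information_potential(xs, ys, sigma=1.0):
    """Basic Cross Information Potential between two 1-D samples with
    Gaussian (Parzen) kernels: the convolution of two Gaussians of
    variance sigma^2 gives a kernel of variance 2*sigma^2."""
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * sigma * math.sqrt(2.0))
    total = 0.0
    for x in xs:
        for y in ys:
            total += norm * math.exp(-((x - y) ** 2) / (4.0 * sigma ** 2))
    return total / (len(xs) * len(ys))

# Two overlapping samples interact strongly (large CIP); well-separated
# samples interact weakly, which is what makes CIP usable as a
# between-cluster interaction measure.
```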

Abstract:

The objective of this work was to evaluate economically the use of qualitative feed restriction for finishing castrated male pigs, considering the performance and carcass traits of 60 animals. Ten pigs were slaughtered at the start of the experimental phase (89.0 ± 4.2 kg) and the remainder were fed diets with five levels of qualitative nutritional restriction (0, 5, 10, 15, and 20%), obtained by including finely ground rice hulls, until the end of the experiment (127.8 ± 2.9 kg). Feed costs over the experimental period (R$alimento) were computed, and the gross revenue was estimated for each carcass of animals slaughtered at 128 kg (RBsuíno128kg) or at the start of the experiment (RBmédia_suíno89kg). From these three figures, the net return (RL) of the experimental diets was calculated as RL = RBsuíno128kg - RBmédia_suíno89kg - R$alimento. Monthly price variations of corn, soybean meal, and pigs were also analyzed, and the corn price was identified as the factor with the greatest impact on the profitability of qualitative restriction. The equation predicting the probability of a linear increase in net return from qualitative restriction was determined as a function of corn price PM (p-value of RL = 0.392 - 0.625PM, R² = 0.73). A significant effect was observed for corn prices roughly four times the cost of rice hulls or more. It is concluded that the viability of qualitative restriction, up to the 20% level, depends on the economic scenario, but above all on the price of corn, the main ingredient replaced in the diets when qualitative restriction is applied, and on its relation to the cost of the residue used for energy dilution.
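The fitted prediction equation reported in the abstract (p-value = 0.392 - 0.625PM, R² = 0.73) can be evaluated directly; this is a sketch under the assumption that PM is the corn price in the study's own units and that 5% is taken as the significance level:

```python
def p_value_linear_rl(pm: float) -> float:
    """Predicted p-value for a linear increase of net return (RL) with
    qualitative restriction, as a function of corn price PM, using the
    equation reported in the abstract: P = 0.392 - 0.625*PM."""
    return 0.392 - 0.625 * pm

def restriction_effect_significant(pm: float, alpha: float = 0.05) -> bool:
    # The linear effect on net return is significant when the predicted
    # p-value falls below the chosen significance level.
    return p_value_linear_rl(pm) < alpha
```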

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Abstract:

In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during this work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether to link auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help choose the correct threshold. Several analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed, and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes in order to obtain an expression for the probability that a point belongs to one of them. Experiments with several values of Na and dt are carried out on test sets, and the results are analyzed to study the robustness of the method and to devise heuristics for choosing the correct threshold. Throughout this work, aspects of information theory applied to the computation of divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also has an appendix presenting real applications of the proposed method.
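The linking step described above (merge auxiliary clusters whose dissimilarity falls below dt) can be sketched with a union-find over centroids. This sketch uses plain 1-D distance as the dissimilarity; the thesis also considers divergence-based measures, which are not reproduced here:

```python
def link_auxiliary_clusters(centroids, dt):
    """Merge auxiliary clusters (represented here by 1-D centroids from
    any vector quantizer) whose pairwise distance is below the threshold
    dt, returning one class label per centroid via union-find."""
    n = len(centroids)
    parent = list(range(n))

    def find(i):
        # Path-halving find.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if abs(centroids[i] - centroids[j]) < dt:
                parent[find(i)] = find(j)

    # Relabel roots as consecutive class indices.
    roots = {}
    return [roots.setdefault(find(i), len(roots)) for i in range(n)]
```

With dt chosen well, the number of resulting classes emerges automatically, which is the behavior the abstract describes.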

Abstract:

This work was carried out to evaluate the Penha CLM-350 harvester and to provide feedback for designers and users, studying the effectiveness of the ear-snapping system and determining the percentage of corn harvested under the various treatments studied. A dimensionless parameter U was defined as the ratio between the peripheral speed of the snapping rolls and the travel speed of the harvester. The various U treatments were related to the percentages of corn grain harvested and lost on the ground. The results were analyzed through analysis of variance, using the F test at the 1% and 5% probability levels and the Tukey test for statistical comparison of means.

Abstract:

Research aimed at reducing grain and seed losses during storage has gained prominence in several countries. In this context, it is very important to assess the effectiveness of a quality index that is widely applicable, methodologically simple, and quick to respond, enabling fast decision making. This work, conducted at the Agricultural Products Processing Laboratory of UNESP, Botucatu/SP, had two objectives: (a) to establish a correspondence between the level of free fatty acids and seed vigor classes; (b) to establish a correspondence between the level of free fatty acids and the commercial classification by type of rice grains (Oryza sativa L.). The correspondence between free fatty acid level and seed vigor classes was evaluated using artificially aged seeds, which provided differentiated vigor levels. The correspondence between free fatty acid level and classification by type was assessed using rice samples containing the maximum percentages of defective grains allowed by current legislation. Analysis of variance for a completely randomized design was used and, for comparison of means, the Tukey test was applied at the 5% probability level. The results showed that the fatty acidity test is feasible for evaluating the vigor of rice seeds. In the investigation of a correspondence between fatty acidity values and commercial classification, the data revealed a tendency of the free fatty acid level to follow the commercial classification by type.

Abstract:

The energy consumption of the mechanized operations involved in producing whole-plant silage and high-moisture corn grain silage was evaluated, using dry processing of the same cereal as a reference. The trial was conducted at the Lageado Experimental Farm, belonging to the Faculdade de Ciências Agronômicas, and at the facilities of the Faculdade de Medicina Veterinária e Zootecnia - UNESP, in the municipality of Botucatu - SP. The experimental design was randomized blocks with plots subdivided over time (three harvest times: whole-plant silage, high-moisture grain silage, and dry grain harvest), with 10 replications. Statistical analyses were performed with the ESTAT program, using the Tukey mean test at 5% probability. Whole-plant silage had the highest fuel consumption per area. Drying the grain from 15.5% to 13% moisture accounted for 87% of the energy expenditure per area. High-moisture grain silage demanded the least energy per area in the mechanized operations.

Abstract:

This work presents a performance analysis of transmission schemes employing turbo trellis-coded modulation. In general, the performance analysis of such schemes is guided by evaluating their error probability. The exact evaluation of this probability is very complex and computationally inefficient; a widely used alternative is the union bound on the error probability, because it is easy to implement and produces bounds that converge quickly. Since it is a union bound, some elements of the distance spectrum must be expurgated to obtain a tight bound. The main contribution of this work is that the proposed expurgation is carried out with puncturing at the symbol level rather than at the bit level, as in most works in the literature. The main reason for symbol-level puncturing is that the enumerating function of the turbo scheme is then obtained directly from the complex signal sequences through the trellis, and not indirectly from binary sequences that require a further binary-to-complex mapping, as proposed in previous works. Matrix algorithms can thus be applied to the adjacency matrix, which is obtained by computing the distances between the complex sequences of the trellis. This work also presents two matrix algorithms, one for state reduction and one for evaluating the transfer function of the reduced trellis. The results, comparing the bounds obtained with the proposed technique against several turbo codes from the literature, corroborate the claim of this work that the expurgated bounds are quite tight and that the matrix algorithms are easily implemented in any programming language.
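Once a (possibly expurgated) distance spectrum is available, the union bound is a weighted sum of Gaussian tail probabilities. The sketch below uses one common form of that sum for illustration; the thesis derives the spectrum from the trellis adjacency matrix, which is not reproduced here, and its exact bound expression may differ:

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x), via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def union_bound(spectrum, es_n0_linear):
    """Union bound on the error probability from a distance spectrum given
    as {squared Euclidean distance: multiplicity}, using the illustrative
    form P_e <= sum_d A_d * Q(sqrt(d^2 * (Es/N0) / 2))."""
    return sum(a_d * q_function(math.sqrt(d2 * es_n0_linear / 2.0))
               for d2, a_d in spectrum.items())

# Expurgating spurious terms from the spectrum removes positive summands,
# so the resulting bound is never looser, and it tightens most at the
# dominant (smallest-distance) terms.
```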

Abstract:

This dissertation describes the implementation of a WirelessHART network simulation module for Network Simulator 3, aiming at adoption both in the current context of network research and in industry. To validate the module, tests were implemented for attenuation, packet error rate, information transfer success rate, and battery duration per station.
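Of the validation metrics listed, the packet error rate is the simplest to state precisely; a generic sketch of the metric (the module's actual ns-3 validation harness is not shown in the abstract):

```python
def packet_error_rate(received_ok: int, sent: int) -> float:
    """Packet error rate over a simulation run: the fraction of sent
    packets that were not received correctly."""
    if sent <= 0:
        raise ValueError("no packets sent")
    return 1.0 - received_ok / sent
```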

Abstract:

The objective of this work was to verify the influence of organic fertilization, over 6 growth periods, on the production of β-ecdysone by Pfaffia glomerata plants. The experiment was conducted at the Santo Antonio do Araquá Farm, Catâneo Ângelo district, municipality of São Manuel, São Paulo, Brazil. A randomized block design was used in a 5x6 factorial scheme with four replications, considering 8 usable plants per plot. The blocks consisted of 6 growth periods (60, 120, 180, 240, 300, and 360 days after germination) and 5 doses of composted chicken manure [control (no fertilization), 15, 30, 45, and 60 t ha-1]. After each harvest, the plant roots were dried in a forced-air oven at 40ºC and weighed for subsequent extraction of β-ecdysone, following the methodology developed by Magalhães (2000). The results were submitted to analysis of variance and to the Scott-Knott means separation test, all at 5% probability. When interaction occurred, the results were evaluated using polynomial regression analysis. The β-ecdysone content was influenced neither by the fertilizer doses nor by the plant growth period. However, the total amount of β-ecdysone per root was influenced by the growth period: at 360 days after emergence, the largest amount of the active principle was found in all treatments. Although it did not differ statistically from the other treatments, at 360 days after plant emergence the 30 t ha-1 treatment provided the largest amount of β-ecdysone.