911 results for Simulated rain
Abstract:
Chromosomes of the South American geckos Gymnodactylus amarali and G. geckoides, from open and dry areas of the Cerrado and Caatinga biomes in Brazil, respectively, were studied for the first time after conventional and AgNOR staining, CBG- and RBG-banding, and FISH with telomeric sequences. Comparative analyses between the karyotypes of the open-area species and the previously studied Atlantic forest species G. darwinii were also performed. The chromosomal polymorphisms detected in populations of G. amarali from the states of Goiás and Tocantins are the result of centric fusions (2n = 38, 39 and 40), suggesting a differentiation from a 2n = 40 ancestral karyotype, and of the presence of supernumerary chromosomes. The CBG- and RBG-banding patterns of the Bs are described. G. geckoides has 40 chromosomes of gradually decreasing size, but its karyotype is distinct from the 2n = 40 karyotypes of G. amarali and G. darwinii due to the occurrence of pericentric inversions or centromere repositioning. NOR location seems to be a marker for Gymnodactylus, as G. amarali and G. geckoides share a medium-sized subtelocentric NOR-bearing pair, while G. darwinii has NORs at the secondary constriction of the long arm of pair 1. The comparative analyses indicate a non-random nature of the Robertsonian rearrangements in the genus Gymnodactylus. Copyright (C) 2010 S. Karger AG, Basel
Abstract:
Croton campanulatus, a new species from southeastern Brazil in the states of Minas Gerais and Rio de Janeiro, is here described and illustrated. Morphological data indicate that this species belongs to Croton section Cleodora based on its arborescent habit, pistillate flowers with imbricate sepals, reduced petals, and multifid styles that are fused at the base.
Abstract:
The corrosion resistance of Ti and Ti-6Al-4V was investigated by electrochemical impedance spectroscopy (EIS), potentiodynamic polarisation curves and UV-Vis spectrophotometry. The tests were carried out in Hank's solution at 25 °C and 37 °C. The EIS measurements were performed at the open circuit potential at specific immersion times. An increase in resistance as a function of immersion time was observed for Ti (at 25 °C and 37 °C) and for Ti-6Al-4V (at 25 °C), which was interpreted as the formation and growth of a passive film on the metallic surfaces. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The volatile oil composition and anti-acetylcholinesterase activity were analyzed in two specimens of Marlierea racemosa growing in different areas of the Atlantic Rain Forest (Cananeia and Caraguatatuba, SP, Brazil). Components were identified by GC/MS, and acetylcholinesterase inhibitory activity was measured by colorimetric analysis. The major constituent in both specimens was spathulenol (25.1% in Cananeia and 31.9% in Caraguatatuba). However, the former also contained monoterpenes (41.2%), while this class was not detected in the Caraguatatuba plants. The oils from the plants collected in Cananeia were able to inhibit acetylcholinesterase activity by up to 75%, whereas for the oils from the other locality the maximal inhibition achieved was 35%. These results suggest that the monoterpenes are more effective inhibitors of acetylcholinesterase activity than the sesquiterpenes, as these compounds are present in higher amounts in the M. racemosa plants collected in Cananeia.
Abstract:
This thesis work concentrates on a very interesting problem, the Vehicle Routing Problem (VRP). In this problem, customers or cities have to be visited and packages have to be transported to each of them, starting from a base point on the map. The goal is to solve the transportation problem: to deliver the packages on time, with enough packages for each customer, using the available resources, and, of course, as efficiently as possible. Although this problem may seem easy to solve with a small number of cities or customers, it is not. The algorithm has to cope with several constraints, for example opening hours, package delivery times and truck capacities, which makes this a so-called Multi-Constraint Optimization Problem (MCOP). What is more, the problem is intractable with the amount of computational power available to most of us: as the number of customers grows, the amount of computation grows exponentially, because all constraints have to be checked for every customer, and a solution that is good enough has to be found before the time allotted for the calculation runs out. The first chapter introduces the problem from its basis, the Traveling Salesman Problem, and uses some theoretical and mathematical background to show why it is so hard to optimize and why, even though no best algorithm is known for a large number of customers, it is still worth dealing with. Just think of a huge transportation company with tens of thousands of trucks and millions of customers: how much money could be saved if the optimal path were known for all packages.

Although no best algorithm is known for this kind of optimization problem, the second and third chapters attempt to give an acceptable solution by describing two algorithms: the Genetic Algorithm and Simulated Annealing. Both are inspired by processes in nature and materials science. These algorithms will hardly ever find the best solution to the problem, but in many cases they can give a very good solution within an acceptable calculation time. These chapters describe the Genetic Algorithm and Simulated Annealing in detail, from their basis in the "real world" through their terminology to their basic implementation, with emphasis on the limits of the algorithms, their advantages and disadvantages, and a comparison between them.

Finally, after the theory is presented, a simulation is executed in an artificial VRP environment with both Simulated Annealing and the Genetic Algorithm; both solve the same problem in the same environment and are compared to each other. The environment and the implementation are also described, as are the test results obtained. Possible improvements of these algorithms are then discussed, and the work tries to answer the "big" question, "Which algorithm is better?", if this question even exists.
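As a rough illustration of the kind of metaheuristic this thesis discusses, the following is a minimal, self-contained sketch of Simulated Annealing applied to a toy single-vehicle routing (TSP-like) instance. The coordinates, cooling parameters and neighbourhood move (a two-position swap) are hypothetical choices for illustration only, not the thesis's actual implementation.

```python
import math
import random

# Hypothetical depot + customer coordinates (not taken from the thesis).
CITIES = [(0, 0), (2, 7), (5, 1), (6, 6), (8, 3), (1, 4), (7, 8)]

def tour_length(order):
    """Total length of a closed tour visiting CITIES in the given order."""
    total = 0.0
    for a, b in zip(order, order[1:] + order[:1]):
        (x1, y1), (x2, y2) = CITIES[a], CITIES[b]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

def simulated_annealing(t_start=10.0, t_end=0.01, alpha=0.95, steps_per_t=100):
    current = list(range(len(CITIES)))
    random.shuffle(current)
    best = current[:]
    t = t_start
    while t > t_end:
        for _ in range(steps_per_t):
            # Neighbour: swap two randomly chosen positions in the tour.
            i, j = random.sample(range(len(current)), 2)
            candidate = current[:]
            candidate[i], candidate[j] = candidate[j], candidate[i]
            delta = tour_length(candidate) - tour_length(current)
            # Accept improvements always, worse moves with Boltzmann probability.
            if delta < 0 or random.random() < math.exp(-delta / t):
                current = candidate
                if tour_length(current) < tour_length(best):
                    best = current[:]
        t *= alpha  # geometric cooling schedule
    return best, tour_length(best)

if __name__ == "__main__":
    order, length = simulated_annealing()
    print("best tour:", order, "length: %.2f" % length)
```

A Genetic Algorithm would attack the same instance by evolving a population of tours with crossover and mutation instead of perturbing a single tour; the comparison between the two approaches is the subject of the later chapters.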
Abstract:
Before signing electronic contracts, a rational agent should estimate the expected utilities of these contracts and calculate the violation risks related to them. In order to perform such pre-signing procedures, this agent has to be capable of computing a policy taking into account the norms and sanctions in the contracts. In relation to this, the contribution of this work is threefold. First, we present the Normative Markov Decision Process, an extension of the Markov Decision Process for explicitly representing norms. In order to illustrate the usage of our framework, we model an example in a simulated aerospace aftermarket. Second, we specify an algorithm for identifying the states of the process which characterize the violation of norms. Finally, we show how to compute policies with our framework and how to calculate the risk of violating the norms in the contracts by adopting a particular policy.
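A rough sketch of the idea, not the paper's actual formalism: the states, transition probabilities, sanction value and norm below are all invented for illustration. Norm violation is modelled as a set of forbidden states, the sanction as a negative reward received on entering such a state, and the violation risk of a policy as the probability of ever reaching a forbidden state under that policy.

```python
# Hypothetical states: 0=start, 1=safe route, 2=risky route,
# 3=contract fulfilled (goal), 4=norm violated (sanctioned).
GOAL, VIOLATION = 3, 4
SANCTION = -20.0  # assumed sanction value, for illustration only
GAMMA = 0.95

# Transition model: (state, action) -> list of (next_state, probability, reward).
MDP = {
    (0, "safe"):  [(1, 1.0, -2.0)],           # slower but norm-compliant
    (0, "risky"): [(2, 1.0,  0.0)],           # faster but may break a norm
    (1, "go"):    [(GOAL, 1.0, 10.0)],
    (2, "go"):    [(GOAL, 0.7, 10.0), (VIOLATION, 0.3, SANCTION)],
    (GOAL, "stay"):      [(GOAL, 1.0, 0.0)],
    (VIOLATION, "stay"): [(VIOLATION, 1.0, 0.0)],
}
STATES = [0, 1, 2, GOAL, VIOLATION]
ACTIONS = {s: [a for (st, a) in MDP if st == s] for s in STATES}

def value_iteration(iters=200):
    """Compute an optimal policy taking sanctions into account."""
    v = {s: 0.0 for s in STATES}
    for _ in range(iters):
        v = {s: max(sum(p * (r + GAMMA * v[s2]) for s2, p, r in MDP[(s, a)])
                    for a in ACTIONS[s]) for s in STATES}
    policy = {s: max(ACTIONS[s],
                     key=lambda a, s=s: sum(p * (r + GAMMA * v[s2])
                                            for s2, p, r in MDP[(s, a)]))
              for s in STATES}
    return v, policy

def violation_risk(policy, iters=200):
    """Probability of ever reaching the violation state under a fixed policy."""
    risk = {s: 1.0 if s == VIOLATION else 0.0 for s in STATES}
    for _ in range(iters):
        risk = {s: 1.0 if s == VIOLATION else
                   (0.0 if s == GOAL else
                    sum(p * risk[s2] for s2, p, _ in MDP[(s, policy[s])]))
                for s in STATES}
    return risk

if __name__ == "__main__":
    values, policy = value_iteration()
    print("chosen action at start state:", policy[0])
    print("violation risk of that policy:", violation_risk(policy)[0])
    risky_policy = dict(policy, **{0: "risky"})
    print("violation risk of the risky alternative:", violation_risk(risky_policy)[0])
```

With the assumed sanction of -20, the computed policy avoids the risky route; lowering the sanction tips the decision the other way, which is exactly the kind of pre-signing trade-off the abstract describes.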
Abstract:
The June issue of the Chronicle of Higher Education showcased as its cover story the blaring headline, “Should the Internet Be Scrapped?” Did this surprise anyone? If it did, you must not have been paying attention. Over the last decade, the Internet, the Web—yes, yes, I know the terms are technically not synonymous but have become so in usage—has become increasingly useless as a scholarly tool. The CHE story discussed the obvious problems: spam, viruses, unreliable connections, not to mention unreliable information, disinformation and even misinformation.
Abstract:
Algorithms based on the Simulated Annealing paradigm and its variations are currently widely used to solve large-scale optimization problems. This popularity results from the extremely simple and apparently universal structure of the algorithms, their general applicability, and their ability to provide solutions very close to the optimum. In the early 1980s, Kirkpatrick and others proposed using the concepts of annealing (the slow, controlled cooling of solids) in combinatorial optimization. The proposal rests on the strong analogy between the physical annealing process and the solution of large combinatorial optimization problems. Simulated Annealing (SA) is a generic name for the algorithms developed from this proposal; they combine local search and randomization techniques. The goal of the present work is to provide an understanding of the characteristics of Simulated Annealing and to facilitate the development of algorithms with these characteristics. To this end, the work presents how Simulated Annealing and its variations are being used to solve combinatorial optimization problems, proposes a formalization through an algorithm development method, and analyzes complexity aspects. The development method specifies an abstract program for a sequential Simulated Annealing algorithm, identifies the functions and predicates that make up the procedures of this abstract program, and establishes axioms that make explicit the properties these procedures must satisfy. The complexity of Simulated Annealing is analyzed from the abstract program and its main procedures, allowing a generic complexity equation to be established; this generic equation applies to algorithms developed with the proposed method. A correctness proof is presented for the abstract program, and an example code is analyzed against the established axioms. The purpose of establishing axioms is to define a semantics for the algorithm, which allows a developer to check the correctness of the code specified for an algorithm with respect to these axioms. The work was based on an introductory study of combinatorial optimization and problem-solving techniques, a historical survey of the use of Simulated Annealing and of the variations around the model, and the documented mathematical foundations. This made it possible to identify the essential characteristics of algorithms based on the paradigm, to analyze the aspects related to these characteristics, such as the different ways of defining a cooling schedule and traversing a solution space, and to build the proposed generic theoretical foundation.
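To illustrate the kind of abstract program described above, here is a minimal, hypothetical skeleton of a sequential Simulated Annealing algorithm in which the problem-specific procedures (initial solution, neighbour generation, cost function, cooling schedule and stopping predicate) are supplied as parameters. The names and the complexity comment are illustrative assumptions, not this thesis's actual abstract program or axioms.

```python
import math
import random
from typing import Callable, TypeVar

S = TypeVar("S")  # solution type supplied by the problem instance

def simulated_annealing(
    initial: Callable[[], S],           # builds an initial solution
    neighbour: Callable[[S], S],        # perturbation procedure
    cost: Callable[[S], float],         # objective to be minimized
    cool: Callable[[float], float],     # cooling schedule, e.g. t -> 0.9 * t
    frozen: Callable[[float], bool],    # outer-loop stopping predicate
    equilibrium_steps: int,             # inner-loop length at each temperature
    t0: float,
) -> S:
    current = initial()
    best = current
    t = t0
    # If the outer loop runs K times and the inner loop L times, the overall
    # cost is roughly O(K * L * (cost of `neighbour` + cost of `cost`)) --
    # a generic complexity equation of this shape, as an assumed illustration.
    while not frozen(t):
        for _ in range(equilibrium_steps):
            candidate = neighbour(current)
            delta = cost(candidate) - cost(current)
            # Metropolis acceptance: always accept improvements, accept
            # worse candidates with probability exp(-delta / t).
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = candidate
                if cost(current) < cost(best):
                    best = current
        t = cool(t)
    return best

# Hypothetical usage: minimize f(x) = (x - 3)^2 over the reals.
if __name__ == "__main__":
    result = simulated_annealing(
        initial=lambda: 0.0,
        neighbour=lambda x: x + random.uniform(-1.0, 1.0),
        cost=lambda x: (x - 3.0) ** 2,
        cool=lambda t: 0.9 * t,
        frozen=lambda t: t < 1e-3,
        equilibrium_steps=50,
        t0=10.0,
    )
    print("approximate minimizer:", round(result, 3))
```

In a development method of the kind the abstract describes, the axioms would constrain the supplied procedures (for example, that `neighbour` keeps solutions feasible and that `cool` is strictly decreasing), and a correctness argument for a concrete instantiation would check those properties against this skeleton.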