917 results for "Localização automática"
Abstract:
In this paper, a fully automatic strategy is proposed to reduce the complexity of the patterns (vegetation, buildings, soils, etc.) that interact with the object 'road' in color images, thus reducing the difficulty of automatically extracting this object. The proposed methodology consists of three sequential steps. In the first step, a pointwise operator known as NandA (Natural and Artificial) is applied to compute an artificiality index; the result is an image whose intensity attribute is the NandA response. The second step consists in automatically thresholding the image obtained in the previous step, resulting in a binary image that usually allows the separation of artificial from natural objects. The third step consists in applying a preexisting road-seed extraction methodology to the previously generated binary image. Several experiments carried out with real images made it possible to verify the potential of the proposed methodology. Comparing the results obtained with those of a similar road-seed extraction methodology for gray-level images showed that the main benefit was a drastic reduction of the computational effort.
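A minimal sketch of steps one and two, assuming a stand-in artificiality index (the abstract does not give the NandA formula; low color saturation is used here as a crude "artificial surface" cue) and Otsu's method as the automatic threshold:

```python
import numpy as np

def artificiality_index(rgb):
    """Per-pixel stand-in for the NandA response: 1 - saturation."""
    mx = rgb.max(axis=-1).astype(float)
    mn = rgb.min(axis=-1).astype(float)
    sat = np.where(mx > 0, (mx - mn) / np.where(mx > 0, mx, 1.0), 0.0)
    return 1.0 - sat

def otsu_threshold(img, bins=256):
    """Automatic threshold (step 2) by maximizing between-class variance."""
    hist, edges = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    w = np.cumsum(p)                        # class-0 cumulative weight
    mu = np.cumsum(p * edges[:-1])          # class-0 cumulative mean
    mu_t = mu[-1]
    denom = np.where(w * (1.0 - w) > 0, w * (1.0 - w), np.inf)
    sigma_b = (mu_t * w - mu) ** 2 / denom  # between-class variance
    return edges[int(np.argmax(sigma_b))]

rng = np.random.default_rng(0)
rgb = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # dummy image
nanda = artificiality_index(rgb)
binary = nanda >= otsu_threshold(nanda)     # input to the seed extractor
```

The binary mask would then feed the preexisting road-seed extraction step, which the abstract treats as a black box.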
Abstract:
A new technique for the automatic search of order parameters and critical properties is applied to several well-known physical systems, testing the efficiency of the procedure in order to apply it to complex systems in general. The automatic-search method is combined with Monte Carlo simulations, which make use of a given dynamical rule for the time evolution of the system. In the problems investigated, the Metropolis and Glauber dynamics produced essentially equivalent results. We present a brief introduction to critical phenomena and phase transitions. We describe the automatic-search method and discuss some previous works where the method has been applied successfully. We apply the method to the ferromagnetic Ising model, computing the critical frontiers and the magnetization exponent β for several geometric lattices. We also apply the method to the site-diluted ferromagnetic Ising model on a square lattice, computing its critical frontier as well as the magnetization exponent β and the susceptibility exponent γ. We verify that the universality class of the system remains unchanged when site dilution is introduced. We study the problem of long-range bond percolation in a diluted linear chain and discuss the non-extensivity questions inherent to long-range-interaction systems. Finally, we present our conclusions and possible extensions of this work.
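The dynamical rule mentioned above can be illustrated with a minimal Metropolis simulation of the 2D ferromagnetic Ising model; this sketch only evolves the lattice and does not implement the automatic-search procedure itself:

```python
import math
import random

random.seed(0)  # reproducible run

def metropolis_sweep(spins, L, T):
    """One lattice sweep of single-spin-flip Metropolis dynamics (J = 1)."""
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nn           # energy cost of flipping (i, j)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i][j] = -spins[i][j]

L = 16
spins = [[1] * L for _ in range(L)]         # ordered initial configuration
for _ in range(50):
    metropolis_sweep(spins, L, T=1.5)       # well below T_c ~ 2.269
m = abs(sum(map(sum, spins))) / L ** 2      # magnetization per spin
```

At T = 1.5, deep in the ordered phase, the magnetization per spin stays close to 1; an automatic-search procedure would scan T for the point where such order parameters vanish.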
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The terminological performance of the descriptors representing the Information Science domain in the SIBi/USP Controlled Vocabulary was evaluated in manual, automatic, and semi-automatic indexing processes. It can be concluded that, in order to achieve better performance (i.e., to adequately represent the content of the corpus), the current Information Science descriptors of the SIBi/USP Controlled Vocabulary must be extended and put into context by means of terminological definitions so that the information needs of users are fulfilled.
Abstract:
This work presents an extension of the haRVey prover aimed at verifying proof obligations arising from the B method. The B method of software development covers the specification, design, and implementation phases of the software life cycle. In the context of verification, the proof tools Prioni, Z/EVES, and Atelier-B/Click'n Prove stand out. They handle formalisms that support satisfiability checking of formulas of axiomatic set theory, and can therefore be applied to the B method. SMT checking consists of checking the satisfiability of quantifier-free first-order formulas with respect to a decidable theory. The SMT-checking approach implemented by the automatic theorem prover haRVey is presented, adopting a theory of arrays that cannot express all the constructs required by set-based specifications. Thus, to extend SMT checking to set theories, the Zermelo-Fraenkel (ZFC) and von Neumann-Bernays-Gödel (NBG) set theories stand out. Since the SMT-checking approach implemented in haRVey requires a finite theory and can be extended to non-decidable theories, NBG presents itself as a suitable option for expanding haRVey's deductive capacity to set theory. Hence, by mapping the set operators provided by the B language to classes of the NBG theory, an alternative SMT-checking approach applied to the B method is obtained.
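The core idea of SMT checking, deciding whether a quantifier-free formula has a model in some theory, can be illustrated with a toy enumeration over a finite universe; real provers such as haRVey use decision procedures, not brute force, and the constraint below is invented for illustration:

```python
from itertools import product

def power_set(universe):
    """All subsets of a finite universe, as frozensets."""
    items = list(universe)
    for mask in range(2 ** len(items)):
        yield frozenset(items[i] for i in range(len(items)) if mask >> i & 1)

def satisfiable(constraint, universe):
    """Search every interpretation of two set variables A, B."""
    subsets = list(power_set(universe))
    for A, B in product(subsets, repeat=2):
        if constraint(A, B):
            return A, B          # a model: the formula is satisfiable
    return None                  # unsatisfiable over this universe

# Constraint in the spirit of a quantifier-free set-theory goal:
# A subset-of B, 1 in A, 2 not-in B.
model = satisfiable(lambda A, B: A <= B and 1 in A and 2 not in B, {0, 1, 2})
```

Enumeration only works because the universe is finite; the point of mapping B's set operators to NBG classes is precisely to keep the background theory finitely axiomatized so a prover can handle it.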
Abstract:
Some programs may have their input data specified by formalized context-free grammars. This formalization facilitates the use of tools to systematize and raise the quality of the testing process. Among programs in this category, compilers were the first to use this kind of tool to automate their tests. In this work we present an approach for defining tests from the formal description of a program's inputs. Sentence generation is performed taking into account the syntactic aspects defined by the specification of the inputs, i.e., the grammar. For optimization, coverage criteria are used to limit the quantity of tests without diminishing their quality; our approach uses these criteria to drive generation toward sentences that satisfy a specific coverage criterion. The approach is based on the Lua language, relying heavily on its coroutines and dynamic construction of functions. With these resources, we propose a simple and compact implementation that can be optimized and controlled in different ways in order to satisfy the different implemented coverage criteria. To make the tool simpler to use, the EBNF notation was adopted for the specification of the inputs; its parser was specified in the Meta-Environment tool for rapid prototyping.
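A hedged sketch of coverage-driven sentence generation (in Python rather than the Lua coroutines the work actually uses), with a hypothetical toy grammar and the simple "all productions" coverage criterion:

```python
# Toy grammar: "S" is the start symbol; lowercase strings are terminals.
GRAMMAR = {
    "S": [["a", "S", "b"], ["c"]],
}

def generate(symbol, grammar, covered, depth=0, max_depth=6):
    """Derive one sentence, preferring productions not yet covered."""
    if symbol not in grammar:              # terminal symbol
        return [symbol]
    # Stable sort puts uncovered productions first.
    prods = sorted(enumerate(grammar[symbol]),
                   key=lambda p: (symbol, p[0]) in covered)
    for idx, prod in prods:
        if depth < max_depth:
            covered.add((symbol, idx))
            out = []
            for sym in prod:
                out += generate(sym, grammar, covered, depth + 1, max_depth)
            return out
    return []                              # depth budget exhausted

covered = set()
sentences = []
total = sum(len(prods) for prods in GRAMMAR.values())
while len(covered) < total:                # "all productions" criterion
    sentences.append(" ".join(generate("S", GRAMMAR, covered)))
```

For this grammar a single sentence, "a c b", already exercises both productions of S, so generation stops immediately; richer criteria from the dissertation would demand more sentences.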
Abstract:
Removing inconsistencies from a design is less costly when done in its early stages. The use of formal methods improves the understanding of systems and offers several techniques, such as formal specification and verification, to identify such inconsistencies in the early stages of a project. However, transforming a formal specification into a programming language is a non-trivial task; when done manually, it is prone to the introduction of errors. Tools that support this stage can bring great benefits to the final product to be developed. This work proposes the extension of a tool whose focus is the automatic translation of CSPm specifications into Handel-C. CSP is a formal description language suitable for working with concurrent systems. Handel-C is a programming language whose output can be compiled directly to FPGAs. The extension increases the number of CSPm operators accepted by the tool, allowing the user to define local processes, rename channels, and use Boolean guards on external choices. In addition, we also propose the implementation of a communication protocol that removes some restrictions on the parallel composition of processes in the translation to Handel-C, allowing communication among multiple processes to be mapped consistently and to occur only when authorized.
Abstract:
Removing inconsistencies in a project is a less expensive activity when done in the early steps of design. The use of formal methods improves the understanding of systems and provides various techniques, such as formal specification and verification, to identify these problems in the initial stages of a project. However, the transformation from a formal specification into a programming language is a non-trivial and error-prone task, especially when done manually. The aid of tools at this stage can bring great benefits to the final product to be developed. This work proposes the extension of a tool whose focus is the automatic translation of specifications written in CSPm into Handel-C. CSP is a formal description language suitable for concurrent systems, and CSPm is the machine-readable notation used by its tool support. Handel-C is a programming language whose output can be compiled directly into FPGAs. Our extension increases the number of CSPm operators accepted by the tool, allowing the user to define local processes, to rename channels in a process, and to use Boolean guards on external choices. In addition, we also propose the implementation of a communication protocol that eliminates some restrictions on the parallel composition of processes in the translation into Handel-C, allowing communication on the same channel among multiple processes to be mapped in a consistent manner, and ensuring that improper communication on a channel, i.e., communication not allowed by the system specification, does not occur in the generated code.
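One of the extended constructs, the Boolean-guarded external choice (written g & c -> P in CSPm), can be sketched as a tiny resolver; this is an illustration of the semantics only, not the tool's translation scheme or its communication protocol:

```python
def external_choice(branches, offered_event):
    """Resolve a guarded external choice.

    branches: list of (guard, event, continuation) triples, mirroring
    the CSPm construct g & c -> P. Only branches whose guard holds are
    offered to the environment; an event with no enabled branch is refused.
    """
    for guard, event, continuation in branches:
        if guard and event == offered_event:
            return continuation
    return None                            # the event is refused

# 'b' is guarded out (guard False), so only 'a' is actually offered.
chosen = external_choice([(True, "a", "P"), (False, "b", "Q")], "a")
refused = external_choice([(True, "a", "P"), (False, "b", "Q")], "b")
```

A false guard removes its branch from the choice entirely, which is exactly why naive translation to hardware needs care: the set of offered events can change with the state.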
Abstract:
Typically, Web services contain only syntactic information that describes their interfaces. Due to this lack of semantic description, service composition becomes a difficult task. To solve this problem, Web services can exploit ontologies for the semantic definition of a service's interface, thus facilitating the automation of service discovery, publication, mediation, invocation, and composition. However, ontology languages such as OWL-S have constructs that are not easy to understand, even for Web developers, and the existing tools that support their use involve many details that make them difficult to manipulate. This work presents an MDD tool called AutoWebS (Automatic Generation of Semantic Web Services) for developing OWL-S semantic Web services. AutoWebS uses an approach based on UML profiles and model transformations for the automatic generation of Web services and their semantic descriptions. AutoWebS offers an environment that provides many of the features required to model, implement, compile, and deploy semantic Web services.
Abstract:
The widespread growth in the use of smart cards (by banks, transport services, cell phones, etc.) has brought out an important fact that must be addressed: the need for tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It was inspired by methodologies from Larch and Eiffel, and has been widely adopted as the de facto language for the specification of Java-related programs. Various tools that make use of JML have already been developed, covering a wide range of functionalities, such as runtime and static checking. But the static-checking tools existing so far are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a set of techniques that can be used to accomplish fully automated and trustworthy verification of JavaCard applets. In this work we present the first steps toward this goal. Using a software platform comprising Krakatoa, Why, and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques have yielded very good results, with gains of almost 100% in all tested cases, and have proved valuable not only here but in most real-world problems related to automatic verification.
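One plausible flavor of such theory-reduction techniques is relevance filtering: keep only the axioms that share symbols with the goal, expanding transitively. The sketch below illustrates that general idea, not the dissertation's actual algorithm, and its axioms and goal are invented:

```python
import re

def symbols(formula):
    """Crude symbol extraction: identifiers longer than one character,
    minus logical keywords (so bound variables don't link axioms)."""
    tokens = set(re.findall(r"[A-Za-z_]\w*", formula))
    return {t for t in tokens if len(t) > 1 and t not in {"forall", "exists"}}

def relevant_axioms(axioms, goal, rounds=2):
    """Select axioms reachable from the goal's symbols."""
    reachable = symbols(goal)
    selected = []
    for _ in range(rounds):                # expand the relevance frontier
        for ax in axioms:
            if ax not in selected and symbols(ax) & reachable:
                selected.append(ax)
                reachable |= symbols(ax)
    return selected

axioms = [
    "forall x. length(append(x, nil)) = length(x)",   # about lists
    "forall s. card(union(s, empty)) = card(s)",      # about sets: irrelevant
]
goal = "length(append(a, nil)) >= 0"
kept = relevant_axioms(axioms, goal)       # only the list axiom survives
```

Shrinking the axiom set sent to the back-end prover is one way the near-100% gains described above could arise: the prover's search space grows quickly with theory size.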
Abstract:
In this work, an algorithm for fault location is proposed. It contains the following functions: fault detection, fault classification, and fault location. Mathematical Morphology is used to process the currents obtained at the monitored terminals. Unlike the Fourier and Wavelet transforms usually applied to fault location, Mathematical Morphology is a non-linear operation that uses only basic operations (addition, subtraction, maximum, and minimum); it is therefore computationally very efficient. For the detection and classification functions, the Morphological Wavelet was used. In the fault-location module, the Multiresolution Morphological Gradient was used to detect the traveling waves and their polarities. Hence, by recording the arrival of the first two traveling waves incident at the measured terminal and knowing the propagation velocity, the fault location can be estimated. The algorithm was applied to a 440 kV power transmission system simulated in ATP. Several fault conditions were studied, and the following parameters were evaluated: fault location, fault type, fault resistance, fault inception angle, noise level, and sampling rate. The results show that the application of Mathematical Morphology to fault location is very promising.
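A minimal sketch of the two ingredients: a 1-D morphological gradient (dilation minus erosion, hence only max, min, and subtraction) to mark a wave-front arrival, and the classical single-ended distance estimate from the first two arrival times. The signal, speed, and times are illustrative, not from the thesis:

```python
def dilate(x, k=3):
    """Sliding-window maximum with a flat structuring element of size k."""
    h = k // 2
    return [max(x[max(0, i - h):i + h + 1]) for i in range(len(x))]

def erode(x, k=3):
    """Sliding-window minimum with a flat structuring element of size k."""
    h = k // 2
    return [min(x[max(0, i - h):i + h + 1]) for i in range(len(x))]

def morph_gradient(x, k=3):
    """Dilation minus erosion: peaks at sharp transitions (wave fronts)."""
    return [d - e for d, e in zip(dilate(x, k), erode(x, k))]

signal = [0.0] * 20 + [1.0] * 20           # idealized incident wave front
g = morph_gradient(signal)
front = g.index(max(g))                     # first sample where the front shows

# Single-ended location from the first two arrivals t1 (incident wave)
# and t2 (reflection from the fault): d = v * (t2 - t1) / 2.
v = 2.9e8                                   # assumed propagation speed, m/s
t1, t2 = 100e-6, 102e-6                     # hypothetical arrival instants
distance = v * (t2 - t1) / 2                # 290.0 m from the terminal
```

The factor of 2 accounts for the reflected wave traveling to the fault and back, which is why only the time difference between the first two arrivals is needed.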
Abstract:
Complex systems have stimulated much interest in the scientific community in the last twenty years. Examples in this area are the Domany-Kinzel cellular automaton (DKCA) and the contact process (CP), which are studied in the first chapter of this thesis. We determine the critical behavior of these systems using the spontaneous-search method and short-time dynamics (STD). Our results confirm that the DKCA and the CP belong to the universality class of Directed Percolation. In the second chapter, we study particle diffusion in two models of stochastic sandpiles. We characterize the diffusion through the diffusion constant D, defined through the relation ⟨x²⟩ = 2Dt. The results of our simulations, using finite-size scaling and STD, show that the diffusion constant can be used to study critical properties. Both models belong to the universality class of Conserved Directed Percolation. We also study the mean-square particle displacement in time and characterize its dependence on the initial configuration and particle density. In the third chapter, we introduce a computational model, called Geographic Percolation, to study watersheds, fractals with applications in various areas of science. In this model, the sites of a network are assigned values between 0 and 1 following a given probability distribution; we order these values, always keeping their localization, and search for the site that makes the network percolate. Once we find this site, we remove it from the network and search for the next site that makes the network percolate again. We repeat these steps until the network is completely occupied. We study the model in 2 and 3 dimensions, and compare the two-dimensional case with networks formed from real data (Alps and Himalayas).
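The definition ⟨x²⟩ = 2Dt can be illustrated by estimating D for a plain 1-D random walk, whose exact diffusion constant is 1/2; the sandpile models measure real particle displacements the same way:

```python
import random

random.seed(42)  # reproducible estimate

def mean_square_displacement(n_walkers, t):
    """<x^2> at time t for independent 1-D random walkers (step +-1)."""
    total = 0
    for _ in range(n_walkers):
        x = sum(random.choice((-1, 1)) for _ in range(t))
        total += x * x
    return total / n_walkers

t = 400
D = mean_square_displacement(5000, t) / (2 * t)   # exact value is 0.5
```

In the sandpile study, deviations of the measured D from its non-critical behavior signal the critical point, which is what makes the diffusion constant useful as a probe.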
Abstract:
Because the penetration depth of Ground Penetrating Radar (GPR) signals is very limited in highly conductive soils, the usefulness of this method in tropical regions is not yet completely known. The main objective of this research is to test the usefulness of the method in Brazil. Two typical problems where GPR has been used in Europe and North America were chosen for this test: the first is to characterize the internal structures of a sand body, and the second is the localization of old buried pipelines. The first test was done near the city of São Bento do Norte, on the northern coast of Rio Grande do Norte state, NE Brazil. In this region, there is a sand dune that is migrating very fast toward adjacent settled areas. To characterize the internal structure of the dune and its relationship to the prevailing wind direction, as a preliminary step toward understanding the dune migration, GPR profiles using the 400 MHz frequency were acquired in the E-W, N-S, NE-SW, and SE-NW directions over the sand dune, intersecting at its top. The practical resolution of the GPR data is around 30 cm, which was sufficient to distinguish individual foresets inside the dune. After applying the elevation correction to the data, we identified that the dips of the bedding structures are smallest for the N-S profile, which is perpendicular to the dominant wind direction, largest for the E-W profile, and intermediate for the SW-NE and SE-NW profiles. Foresets in the E-W profile dip at angles varying from 2 to 6 degrees. In the E-W profile, the water table and a horizontal truncation interface separating two generations of dunes were identified, as well as an abrupt directional change in the foreset patterns associated with a lateral contact between two dune generations, the older one extending to the west. The high frequency used (400 MHz) does not allow penetration deep enough to map these internal contacts completely.
The second test was done near Estreito, a small town near the city of Carnaúbais, also in Rio Grande do Norte state. In this locality, there are several old pipelines buried in an area covered by plantations, where digging should be minimized. Several GPR profiles using the 400 and 200 MHz frequencies were acquired, trying to intercept the possible pipelines perpendicularly. Because of the high conductivity of the soil, the raw data can hardly be used to identify the pipelines. However, after adequate processing of the 200 MHz profiles, six pipelines were identified. As a global result of the tests, GPR can be very useful if the conductivity of the ground is low or, in the case of soils of medium conductivity, if adequate processing is performed.
Abstract:
The northern portion of Rio Grande do Norte state is characterized by intense coastal dynamics affecting areas with ecosystems of moderate to high environmental sensitivity. The main socioeconomic activities of the state are installed in this region: the salt industry, shrimp farming, fruit growing, and the oil industry. The oil industry suffers the effects of coastal dynamics, which cause problems such as erosion and the exposure of wells and pipelines along the shore. Hence the need to monitor such modifications, seeking to understand the changes that cause environmental impacts, in order to detect and assess the areas most vulnerable to these variations. Coastal areas under the influence of the oil industry are highly vulnerable and sensitive in case of accidents involving oil spills in their vicinity. Therefore, geoenvironmental monitoring of the region was established with the aim of evaluating the evolution of the entire coastal area and checking the sensitivity of each site to the presence of oil. The goal of this work was the implementation of a computer system that meets the needs for inserting and visualizing thematic maps for the generation of Environmental Vulnerability maps, using Business Intelligence (BI) techniques over vector information previously stored in the database. The fundamental design interest was to implement a scalable system that serves diverse fields of study and is suitable for generating vulnerability maps online, automating the methodology so as to facilitate data manipulation and deliver fast results for real-time operational decision-making. To develop the geographic database, it was necessary to generate the conceptual model of the selected data; the Web system was developed using the PostgreSQL database system, its spatial extension PostGIS, the Glassfish web server, and GeoServer to display maps on the Web.
Abstract:
OBJECTIVE: To evaluate the prevalence of trachoma in schoolchildren of Botucatu, SP, Brazil, and the spatial distribution of the cases. METHODS: A cross-sectional study was carried out in November 2005 with children aged 7-14 years attending elementary schools in Botucatu/SP. The sample size was estimated at 2,092 children, considering the historical prevalence of 11.2%, an estimation error of 10%, and a 95% confidence level. The sample was probabilistic and weighted, and was increased by 20% to account for possible losses; 2,692 children were examined. The diagnosis was clinical, based on World Health Organization (WHO) guidelines. For the evaluation of the spatial data, the CartaLinx program (v1.2) was used, with the school-demand sectors digitized according to the planning divisions of the Department of Education. The data were analyzed statistically, and the spatial structure of the events was analyzed using the Geoda program. RESULTS: The prevalence of trachoma in the schoolchildren of Botucatu was 2.9%, with cases of follicular trachoma detected. The exploratory spatial analysis did not allow rejection of the null hypothesis of randomness (I = -0.45, p > 0.05), with no significant school-demand sectors. The analysis of the Thiessen polygons also showed that the global pattern was random (I = -0.07; p = 0.49). However, the local indicators pointed to a low-low cluster for one polygon in the north of the urban area. CONCLUSION: The prevalence of trachoma in schoolchildren of Botucatu was 2.9%. The analysis of the spatial distribution did not reveal areas of greater case clustering. Although the global pattern of the disease does not reproduce the socioeconomic conditions of the population, the lowest trachoma prevalence was found in sectors of lower social vulnerability.
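Under one common reading of the stated inputs (historical prevalence 11.2%, 10% relative error, 95% confidence), the standard sample-size formula for a proportion gives the value below; note that it yields a larger n than the reported 2,092, which presumably also reflects design adjustments (e.g., weighting or a finite-population correction) not modeled in this sketch:

```python
import math

# n = z^2 * p0 * (1 - p0) / d^2, with d the absolute error on the prevalence.
z = 1.96              # normal quantile for 95% confidence
p0 = 0.112            # historical prevalence (11.2%)
eps = 0.10            # assumed: "10% error" read as relative error
d = eps * p0          # absolute error on the prevalence
n = math.ceil(z ** 2 * p0 * (1 - p0) / d ** 2)
```

Reading the 10% as an absolute error instead would give a far smaller n, so the relative-error interpretation is the one closest to the reported figure.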