984 results for software updating
Abstract:
In exploration seismology, the geological targets of oil and gas reservoirs in complex media require high-accuracy imaging of the structure and lithology of the medium, so prestack imaging and elastic inversion of seismic waves in complex media have moved to the leading edge of research. The seismic response measured at the surface carries two fundamental pieces of information: the propagation effects of the medium and the reflections from the different layer boundaries in the medium. The propagation effects represent the low-wavenumber component of the medium, the so-called trend or macro layering, whereas the reflections represent the high-wavenumber component, the detailed or fine layering. Migration velocity analysis resolves the low-wavenumber component of the medium, while prestack elastic inversion resolves the high-wavenumber component. This dissertation studies both aspects: migration velocity estimation and elastic inversion.
Firstly, any migration velocity analysis method must include two basic elements: a criterion that tells whether the model parameters are correct, and an updating scheme that tells how to update the model parameters when they are incorrect; both determine the accuracy and efficiency of the velocity estimation method. In this dissertation, a migration velocity analysis method based on CFP technology is presented, in which a top-down layer-stripping strategy is adopted to avoid the difficulties of the selection procedure. The proposed method has the advantage that the travel-time errors obtained from the DTS panel are defined directly in time, in contrast to methods based on common image gathers, in which the residual curvature measured in depth must be converted to travel-time errors. Four aspects of the proposed migration velocity analysis method have been improved:
- A new parameterization of the velocity model is provided, in which the layer boundaries are interpolated with cubic splines through control locations, and the velocity within a layer may vary laterally, computed as a piecewise (segmented) linear function of the velocities at the lateral control points. This parameterization is well suited to the updating procedure.
- Analytical formulas for the travel-time errors and the model parameter updates in the t-p domain are derived under the assumption of local lateral homogeneity. The velocity estimates are computed iteratively as a parametric inversion; a zero differential time shift in the DTS panel for each layer indicates convergence of the velocity estimation.
- A method of building the initial model from a priori information is provided to improve the efficiency of the velocity analysis: events of interest are picked in the stacked section to define the layer boundaries, and the results of conventional velocity analysis are used to define the layer velocities.
- An interactive, integrated software environment combining migration velocity analysis and prestack migration is built.
The proposed method is first applied to synthetic data; the velocity estimation results show that both the accuracy and the efficiency of the method are very good. The method is also applied to a field data set, a marine data set.
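As an illustration of the velocity-model parameterization described in the first item above, the following minimal Python sketch (with hypothetical control-point values; not code from the dissertation) interpolates layer boundaries with cubic splines through control locations and evaluates the laterally varying layer velocity as a piecewise linear function of the velocities at the lateral control points.

# Minimal sketch of the layered velocity-model parameterization described above.
# Control-point values are hypothetical; the dissertation's actual model format is not shown here.
import numpy as np
from scipy.interpolate import CubicSpline

# Lateral control positions (m) and, per layer, boundary depths (m) and velocities (m/s)
x_ctrl = np.array([0.0, 1000.0, 2000.0, 3000.0])
boundary_depth_ctrl = np.array([[400., 420., 450., 430.],    # bottom of layer 1
                                [900., 950., 980., 940.]])   # bottom of layer 2
velocity_ctrl = np.array([[1500., 1520., 1540., 1530.],      # layer 1
                          [2000., 2050., 2100., 2080.]])     # layer 2

def boundary_depth(layer, x):
    """Layer-boundary depth at lateral position x via cubic-spline interpolation."""
    return CubicSpline(x_ctrl, boundary_depth_ctrl[layer])(x)

def layer_velocity(layer, x):
    """Laterally varying layer velocity as a piecewise (segmented) linear function."""
    return np.interp(x, x_ctrl, velocity_ctrl[layer])

if __name__ == "__main__":
    x = np.linspace(0.0, 3000.0, 7)
    print("boundary 1 depth:", boundary_depth(0, x))
    print("layer 2 velocity:", layer_velocity(1, x))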
In the field-data example, prestack and poststack depth migrations of the data are carried out using different velocity models built with different methods. The comparison between them shows that the model from the proposed method is better and clearly improves the quality of the migration. Following the theoretical method of expressing a multi-variable function by products of single-variable functions suggested by Song Jian (2001), the separable expression of the one-way wave operator has been studied. An optimized approximation based on a separable expression of the one-way wave operator is presented; it easily handles lateral velocity variations in the space and wavenumber domains respectively and has good approximation accuracy. A new prestack depth migration algorithm based on this optimized separable-expression approximation is developed and used to test the results of the velocity estimation.
Secondly, according to the theory of seismic wave reflection and transmission, the variation of the amplitude with the incident angle is related to the elasticity of the media on the two sides of an interface in the subsurface. Conventional inversion of poststack data can only use the information of the reflection operator at zero incidence angle; if more robust solutions are required, the amplitudes at all incidence angles should be used. A natural separable expression of the reflection/transmission operator is presented, which is a sum of products of two groups of functions: one group varies with the phase-space variables, while the other is related to the elastic parameters of the medium and the geological structure. By employing this natural separable expression, a seismic wave modeling method based on the one-way wave equation is developed to model the primary reflected waves; it is adapted, to a certain extent, to heterogeneous media, and it preserves the AVA accuracy of the reflections when the incidence angle is less than 45 degrees. The computational efficiency of the scheme is very high. The natural separable expression of the reflection/transmission operator is also used to construct a prestack elastic inversion algorithm. Differently from AVO analysis and inversion, in which the angle gathers formed during prestack migration are used, the proposed algorithm constructs a system of linear equations during prestack migration via the separable expression of the reflection/transmission operator. The unknowns of the linear equations are related to the elasticity of the medium, so their solutions provide the elastic information of the medium. The proposed inversion method is essentially the same as AVO inversion; the difference between them lies only in the way the amplitude variation with incidence angle is processed and in the computational domain.
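Schematically, and with notation assumed here rather than taken from the dissertation, the separable expression of the reflection/transmission operator and the linear system it yields during prestack migration can be written as

\[
R(p,\mathbf{m}) \;\approx\; \sum_{k=1}^{K} f_k(p)\, g_k(\mathbf{m}),
\qquad
d_i \;=\; \sum_{k=1}^{K} f_k(p_i)\, x_k, \quad x_k = g_k(\mathbf{m}),
\]

where p is a phase-space variable (for example, ray parameter or incidence angle), \mathbf{m} collects the elastic parameters at the image point, d_i are the amplitudes assembled during prestack migration, and solving for the unknowns x_k provides the elastic information of the medium.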
Abstract:
AGROSCRE is a computer program written in Quick Basic 4.5 to facilitate the evaluation of pesticide active ingredients by the GOSS method, by the GUS index, and by criteria of the EPA (Environmental Protection Agency). The GOSS method indicates the potential for transport of an active ingredient bound to sediment or dissolved in water, and the GUS index indicates the leaching potential of active ingredients (a.i.). The EPA criteria also assess these transport tendencies. To evaluate a given active ingredient with the three models, the following properties of the a.i. are needed: organic-carbon adsorption constant (Koc), soil half-life (t½ soil), half-life in water (t½ water), water solubility, and Henry's constant (H); the minimum data required to run at least one of the models are Koc and t½ soil. The program runs as an executable file on any PC-type computer, in a user-friendly environment.
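As a worked illustration of the GUS index mentioned above, a minimal Python sketch follows (the GOSS and EPA criteria, which additionally use t½ water, solubility, and H, are not shown). The GUS formula, GUS = log10(t½ soil) × (4 - log10(Koc)), and the 1.8/2.8 classification thresholds are the commonly cited ones; the example property values are hypothetical.

import math

def gus_index(half_life_soil_days: float, koc: float) -> float:
    """Groundwater Ubiquity Score: GUS = log10(soil half-life) * (4 - log10(Koc))."""
    return math.log10(half_life_soil_days) * (4.0 - math.log10(koc))

def gus_class(gus: float) -> str:
    """Commonly cited GUS classification thresholds."""
    if gus < 1.8:
        return "non-leacher"
    if gus > 2.8:
        return "probable leacher"
    return "transition range"

if __name__ == "__main__":
    # Hypothetical active-ingredient properties: soil half-life = 60 days, Koc = 100 L/kg
    g = gus_index(60.0, 100.0)
    print(f"GUS = {g:.2f} -> {gus_class(g)}")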
Abstract:
Program design is an area of programming that can benefit significantly from machine-mediated assistance. A proposed tool, called the Design Apprentice (DA), can assist a programmer in the detailed design of programs. The DA supports software reuse through a library of commonly-used algorithmic fragments, or cliches, that codifies standard programming knowledge. The cliche library enables the programmer to describe the design of a program concisely. The DA can detect some kinds of inconsistencies and incompleteness in program descriptions. It automates detailed design by automatically selecting appropriate algorithms and data structures. It supports the evolution of program designs by keeping explicit dependencies between the design decisions made. These capabilities of the DA are underlain by a model of programming, called programming by successive elaboration, which mimics the way programmers interact. Programming by successive elaboration is characterized by the use of breadth-first exposition of layered program descriptions and the successive modifications of descriptions. A scenario is presented to illustrate the concept of the DA. Techniques for automating the detailed design process are described. A framework is given in which designs are incrementally augmented and modified by a succession of design steps. A library of cliches and a suite of design steps needed to support the scenario are presented.
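Purely as a hypothetical Python sketch of two of the ideas named above, a library of reusable cliches and explicit dependencies between design decisions, and not the Design Apprentice's actual representation:

# Hypothetical sketch: a tiny cliche library plus explicit dependencies between
# design decisions (not the Design Apprentice's actual representation).
from dataclasses import dataclass, field

@dataclass
class Cliche:
    name: str
    requires: list  # names of supporting cliches this fragment assumes

LIBRARY = {
    "priority-queue": Cliche("priority-queue", requires=[]),
    "dijkstra-shortest-path": Cliche("dijkstra-shortest-path", requires=["priority-queue"]),
}

@dataclass
class Design:
    decisions: dict = field(default_factory=dict)   # decision -> chosen cliche
    depends_on: dict = field(default_factory=dict)  # decision -> supporting cliches

    def choose(self, decision: str, cliche_name: str):
        cliche = LIBRARY[cliche_name]
        self.decisions[decision] = cliche_name
        # Record explicit dependencies so later changes can be traced and revisited.
        self.depends_on[decision] = list(cliche.requires)

design = Design()
design.choose("route-finding", "dijkstra-shortest-path")
print(design.decisions, design.depends_on)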
Abstract:
The future of the software industry is today being shaped in the courtroom. Most discussions of intellectual property to date, however, have been framed as debates about how the existing law --- promulgated long before the computer revolution --- should be applied to software. This memo is a transcript of a panel discussion on what forms of legal protection should apply to software to best serve both the industry and society in general. After addressing that question we can consider what laws would bring this about.
Abstract:
The dream of pervasive computing is slowly becoming a reality. A number of projects around the world are constantly contributing ideas and solutions that are bound to change the way we interact with our environments and with one another. An essential component of the future is a software infrastructure that is capable of supporting interactions on scales ranging from a single physical space to intercontinental collaborations. Such infrastructure must help applications adapt to very diverse environments and must protect people's privacy and respect their personal preferences. In this paper we indicate a number of limitations present in the software infrastructures proposed so far (including our previous work). We then describe the framework for building an infrastructure that satisfies the abovementioned criteria. This framework hinges on the concepts of delegation, arbitration and high-level service discovery. Components of our own implementation of such an infrastructure are presented.
Abstract:
This thesis presents SodaBot, a general-purpose software agent user-environment and construction system. Its primary component is the basic software agent --- a computational framework for building agents which is essentially an agent operating system. We also present a new language for programming the basic software agent whose primitives are designed around human-level descriptions of agent activity. Via this programming language, users can easily implement a wide range of typical software agent applications, e.g. personal on-line assistants and meeting scheduling agents. The SodaBot system has been implemented and tested, and its description comprises the bulk of this thesis.
Abstract:
Software bugs are violated specifications. Debugging is the process that culminates in repairing a program so that it satisfies its specification. An important part of debugging is localization, whereby the smallest region of the program that manifests the bug is found. The Debugging Assistant (DEBUSSI) localizes bugs by reasoning about logical dependencies. DEBUSSI manipulates the assumptions that underlie a bug manifestation, eventually localizing the bug to one particular assumption. At the same time, DEBUSSI acquires specification information, thereby extending its understanding of the buggy program. The techniques used for debugging fully implemented code are also appropriate for validating partial designs.
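As a rough, hypothetical Python sketch of the idea of localizing a bug to one underlying assumption (DEBUSSI's actual mechanism reasons over logical dependencies and is not reproduced here), one can retract candidate assumptions one at a time and report the one whose retraction removes the bug manifestation:

# Hypothetical sketch: localize a failure to one underlying assumption by
# retracting assumptions one at a time (not DEBUSSI's actual mechanism).
def manifests_bug(assumptions: set) -> bool:
    # Stand-in for running the program/test under a set of assumptions;
    # here the bug appears whenever assumption "A3" is relied upon.
    return "A3" in assumptions

def localize(assumptions: set):
    for a in sorted(assumptions):
        if not manifests_bug(assumptions - {a}):
            return a  # retracting this assumption removes the manifestation
    return None

print(localize({"A1", "A2", "A3", "A4"}))  # -> "A3"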
Abstract:
In order to make available a database of electrostatic potential values for all protein structures deposited in the PDB, the GRASP program (Graphical Representation and Analysis of Structural Properties) (Nicholls et al., 1991) was used to generate this database.
Abstract:
Principles of the lightweight software process. Low bureaucracy and adaptation to the characteristics of the projects. Basic guidelines for project management and configuration management. Project management. Configuration management. Definition of the basic guidelines and of the audit process. Dissemination of a language for defining software representations. Use of public-domain tools. Testing early and often. Actions for implementing the lightweight software process. Definition of the basic guidelines and of the process audit. Identification of good practices at Embrapa Informática Agropecuária. Dissemination of the lightweight software process. Related work.
Abstract:
v. 1. Aspects of software product quality at Embrapa. Overview of quality. Software quality. Software product quality certification. NBR 13596 - quality model: characteristics and subcharacteristics. NBR 12119 - software packages - testing and quality requirements. Quality at Embrapa.
Abstract:
The Agência de Informação Embrapa makes qualified and organized information available on the internet, often including information generated by Embrapa itself. The software solutions presented in this work address the management of this information, which is stored in a centralized database and updated over the internet by the system's applications. The purpose of presenting these solutions is to contribute to the development of systems with a similar methodological orientation. The main source of requirements for this system was the set of shortcomings of its first version, which was oriented exclusively toward handling data formatted in XML. The new version has an architecture based on the Java 2 Enterprise Edition (J2EE) guidelines: a layered model (Model View Controller - MVC), the use of containers, and a database management system. The result is a system that is more robust as a whole, with improved maintainability. Index terms: J2EE, XML, PDOM, Model View Controller - MVC, Oracle.
Abstract:
The sugarcane crop has been undergoing profound technological and social changes in this decade, seeking to adapt to the demands of production with high productivity, competitiveness, and respect for the environment. Although Brazil is the world's largest producer of sugarcane, the straw is still burned in the field to facilitate harvesting, which causes economic, social, and environmental losses. Without this burning (Decree No. 42056 of the State of São Paulo), the soil cover provided by the straw will bring significant changes in crop management and in nitrogen dynamics. Given the complexity of the nitrogen cycle in the soil, its various transformation pathways, and climatic variations, it is difficult to determine the best nitrogen management in cropping systems, since there is no soil analysis to support the farmer in this management. Simulation models that describe soil nitrogen transformations can predict values and point to the best nitrogen management, both from the standpoint of sugarcane productivity and of environmental quality. Thus, the preliminary model proposed in Phase I of this study, in Technical Report 22 of Embrapa Informática Agropecuária, was adjusted in Phase II of the project with values for tropical soils and rebuilt in the STELLA simulation software, aggregating all available knowledge on the subject in the form of mathematical expressions. Numerical simulations of typical situations generated scenarios that allowed technical discussions on improving the management of nitrogen fertilizer. It was concluded that, despite the complex dynamics of nitrogen in the soil-plant system and the difficulties inherent in measuring available forms of N, the adjusted model proved to be an alternative for researchers, technicians, and producers in understanding the processes involving nitrogen in the system, helping in the search for solutions for better management of nitrogen fertilizers in the sugarcane crop to maintain adequate yields.
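The report's calibrated STELLA model is not reproduced here; the following minimal Python sketch, with hypothetical first-order rate constants and stocks, only illustrates the kind of soil-nitrogen box model the text describes:

# Minimal sketch of a soil-nitrogen box model of the kind described above.
# Rate constants and initial stocks are hypothetical, not the report's calibrated values.
organic_n = 2000.0        # kg N/ha in straw and soil organic matter
mineral_n = 20.0          # kg N/ha available to the crop
k_mineralization = 0.002  # 1/day
k_uptake = 0.015          # 1/day
k_leaching = 0.003        # 1/day
crop_uptake_total = 0.0

dt = 1.0  # day, explicit Euler step
for day in range(365):
    mineralized = k_mineralization * organic_n * dt
    uptake = k_uptake * mineral_n * dt
    leached = k_leaching * mineral_n * dt
    organic_n -= mineralized
    mineral_n += mineralized - uptake - leached
    crop_uptake_total += uptake

print(f"mineral N after 1 year: {mineral_n:.1f} kg/ha, crop uptake: {crop_uptake_total:.1f} kg/ha")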
Abstract:
In the context of this work, interatomic contacts are defined as the attractive or repulsive forces existing between distinct atoms.