982 results for Validated Interval Software
Abstract:
To repair shaft-type parts, a green remanufacturing system based on laser cladding technology was built. The system consists of hardware (a 6 kW CO2 laser, a four-axis worktable, a powder feeder, and a CNC system) and the driver software of the laser green remanufacturing system. Taking shafts as typical parts and Ni60A alloy powder as the cladding material, the laser green remanufacturing process was studied. By analyzing how the main parameters (laser power, cladding speed, powder feed rate, and cladding track spacing) affect the height, width, and quality of the cladding layer, the optimal combination of process parameters for laser repair of shaft parts was obtained, bringing laser-cladding-based green remanufacturing technology into production practice.
Abstract:
Seismic While Drilling (SWD) is a new wellbore seismic technique. It uses the vibrations produced by a drill-bit while drilling as a downhole seismic energy source. The continuous signals generated by the drill-bit are recorded by a pilot sensor attached to the top of the drill-string. Seismic receivers positioned near the earth's surface record both the direct waves and the waves reflected from geologic formations. The pilot signal is cross-correlated with the receiver signals to compute the travel-times of the arrivals (direct and reflected) and to attenuate incoherent noise. No downhole instrumentation is required to obtain the data, and recording does not interfere with the drilling process. These characteristics offer a method by which borehole seismic data can be acquired, processed, and interpreted while drilling. As a Measurement-While-Drilling technique, SWD provides real-time seismic data for use at the well site. This can aid the engineer or driller by indicating the position of the drill-bit and providing a look at reflecting horizons not yet encountered by the drill-bit. Furthermore, the ease with which surface receivers can be deployed makes multi-offset VSP economically feasible. This paper first studies the drill-bit wavefield and the mode of interaction between the drill-bit and the formation below it; modern signal-processing techniques are applied to the seismic data, and the seismic body-wave radiation pattern of a working roller-cone drill-bit is characterized by theoretical modeling. A systematic analysis of the drill-bit wave is then given: the time-distance equation of seismic wave travel is established, the SWD process is simulated in software, and adaptive modeling of SWD is carried out. To spread this technique, trial SWD modeling was performed during drilling.
The paper sketches out the procedure for trial SWD modeling during drilling, the instruments involved and their functions, and the trial results. Subsurface conditions ahead of the drill-bit can be predicted: the drill-string velocity is obtained by autocorrelation of the pilot signal; the drill-string multiples in the pilot signal are removed by reference deconvolution; and the cross-correlation process enhances the signal-to-noise ratio and helps discriminate lithologies. Finally, SWD provides real-time seismic data for use at the well site, for well trajectory control and for finding and preserving reservoirs in exploratory wells. Interval velocity is computed from the travel-times, and the interval velocities obtained reflect the pore pressure in the subsurface units ahead of the drill-bit; the presence of fractures in subsurface formations is detected by shear waves.
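The cross-correlation step described in this abstract can be sketched on synthetic data: a surface receiver records the pilot (drill-bit) signature delayed by the travel-time plus incoherent noise, and the lag of the cross-correlation peak recovers that travel-time. All signals and numbers below are invented for illustration, not field data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: the pilot sensor records the drill-bit signature;
# a surface receiver records the same signature delayed by the travel-time,
# plus incoherent noise.
n, true_delay = 2000, 137          # samples; travel-time in samples (assumed)
pilot = rng.standard_normal(n)
receiver = np.zeros(n)
receiver[true_delay:] = 0.5 * pilot[:-true_delay]
receiver += 0.2 * rng.standard_normal(n)

# Cross-correlate pilot with receiver; the lag of the peak estimates the
# travel-time, and the stacking inherent in correlation attenuates the noise.
xcorr = np.correlate(receiver, pilot, mode="full")
lag = int(np.argmax(xcorr)) - (n - 1)
print(lag)  # recovers true_delay (137)
```

The same idea, applied per receiver, yields the direct and reflected travel-times the abstract refers to.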
Abstract:
This dissertation, which includes most of the author's Ph.D. research work during 2001-2002, covers the large-scale distribution of continental earthquakes in mainland China, the mechanism and statistical features of grouped strong earthquakes related to tidal triggering, some results in earthquake prediction with correlation-analysis methods, and the lessons from the two strong continental earthquakes in South Asia in 2001. Mainland China is the only continental sub-plate compressed by collision boundaries on two sides, within which earthquakes are dispersed and distributed along seismic belts of differing widths. The control that the continental block boundaries exert over strong earthquakes and seismic hazards is calculated and analyzed in this dissertation. By mapping the distribution of the 31,282 earthquakes of ML ≥ 2.0, I found that the depth of continental earthquakes depends on the tectonic zoning: events on the boundaries of relatively intact blocks are deep, while those on newly developed ruptures are shallow. The average depth of earthquakes in the west of China is about 5 km greater than in the east. The western and southwestern rim of the Tarim Basin generated the deepest earthquakes in mainland China. Statistical results from the correlation between grouped M7 earthquakes and tidal stress show that the strong events were modulated by tidal stress during active periods. Taking the Taiwan area as an example, the dependence of moderate events on the moon phase angle (D) is analyzed: the number of earthquakes in Taiwan when D is 50°, 50°+90°, and 50°+180° exceeds the average frequency per degree by more than two standard deviations, corresponding to the 4th, 12th, and 19th solar days after the new moon. The probability of an earthquake striking the densely populated Taiwan island on the 4th solar day is about 4 times that on other solar days.
On the practice of earthquake prediction, I calculated and analyzed the temporal correlation of the earthquakes in the Xinjiang, Qinghai-Tibet, west Yunnan, and North China areas with those in their adjacent areas, and predicted at the end of 2000 that 2001-2003 would be a special time interval within which moderate to strong earthquakes would be more active in the west of China. What happened in 2001 partly validated the prediction. Within 10 months there were two great continental earthquakes in South Asia, i.e., the M7.8 event in India on Jan. 26 and the M8.1 event in China on Nov. 14, 2001, the largest earthquakes of the past 50 years for India and China, respectively. There is no prior record of two great earthquakes in Asia within so short an interval. These two events prompt reflection on the following aspects: the influence of flawed deployment of seismic stations on the precise location and focal-mechanism determination of strong earthquakes must be confronted; it is very important to introduce comparative-seismology research into seismic hazard analysis and earthquake prediction research; improvement or change in real-time prediction of strong earthquakes from precursors is urgently needed; and methods need to be refreshed to protect the environment and historical relics in earthquake-prone areas.
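The moon-phase statistic described above can be sketched as follows: bin event counts by phase angle and flag angles whose count exceeds the mean by more than two standard deviations. The catalog below is synthetic, with an excess injected at 50° purely for illustration; the real counts and the 50° result come from the dissertation's Taiwan catalog.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic catalog: moon phase angles (degrees) of earthquake events,
# uniform background plus an artificial excess near 50° for illustration.
angles = np.concatenate([
    rng.integers(0, 360, size=3600),   # background: ~10 events per degree
    np.full(60, 50),                   # injected excess at 50 degrees
])

counts = np.bincount(angles.astype(int), minlength=360)
mean, sd = counts.mean(), counts.std()

# Flag phase angles whose count exceeds the mean by > 2 standard deviations
flagged = np.flatnonzero(counts > mean + 2 * sd)
print(flagged)  # includes 50
```

A real analysis would additionally correct for catalog completeness and the uneven time the moon spends at each phase angle.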
Abstract:
AGROSCRE is a computer program written in Quick BASIC 4.5 to ease the screening of pesticide active ingredients by the GOSS method, by the GUS index, and by criteria of the EPA (Environmental Protection Agency). The GOSS method indicates the potential for transport of an active ingredient bound to sediment or dissolved in water, and the GUS method indicates the leaching potential of the active ingredients (a.i.). The EPA criteria also assess these transport tendencies. Evaluating an active ingredient with all three models requires the following data for the a.i.: the organic-carbon adsorption constant (Koc), the soil half-life (t½ soil), the water half-life (t½ water), the water solubility, and the Henry constant (H); the minimum data needed to run at least one of the models are Koc and t½ soil. The program runs as an executable file on any PC-type computer, in a user-friendly environment.
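The GUS index mentioned above is conventionally computed from exactly the two minimum inputs the abstract names, Koc and t½ soil (Gustafson's formula); the sketch below uses that published formula and illustrative thresholds, not AGROSCRE's actual code or output.

```python
import math

def gus_index(half_life_soil_days, koc):
    """GUS leaching index: log10(t1/2 soil) * (4 - log10(Koc))."""
    return math.log10(half_life_soil_days) * (4.0 - math.log10(koc))

def gus_class(gus):
    """Conventional GUS interpretation bands."""
    if gus > 2.8:
        return "likely leacher"
    if gus < 1.8:
        return "unlikely leacher"
    return "transition"

# Illustrative values (not from AGROSCRE): t1/2 = 60 days, Koc = 100 L/kg
g = gus_index(60, 100)
print(round(g, 2), gus_class(g))  # 3.56 likely leacher
```

High persistence (large t½) combined with weak sorption (small Koc) drives the index up, matching the intuition that such compounds reach groundwater.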
Abstract:
Program design is an area of programming that can benefit significantly from machine-mediated assistance. A proposed tool, called the Design Apprentice (DA), can assist a programmer in the detailed design of programs. The DA supports software reuse through a library of commonly used algorithmic fragments, or cliches, that codifies standard programming practice. The cliche library enables the programmer to describe the design of a program concisely. The DA can detect some kinds of inconsistencies and incompleteness in program descriptions. It automates detailed design by automatically selecting appropriate algorithms and data structures. It supports the evolution of program designs by keeping explicit dependencies between the design decisions made. These capabilities of the DA are underlaid by a model of programming, called programming by successive elaboration, which mimics the way programmers interact. Programming by successive elaboration is characterized by the breadth-first exposition of layered program descriptions and the successive modification of those descriptions. A scenario is presented to illustrate the concept of the DA. Techniques for automating the detailed design process are described. A framework is given in which designs are incrementally augmented and modified by a succession of design steps. A library of cliches and a suite of design steps needed to support the scenario are presented.
Abstract:
The future of the software industry is today being shaped in the courtroom. Most discussions of intellectual property to date, however, have been framed as debates about how the existing law --- promulgated long before the computer revolution --- should be applied to software. This memo is a transcript of a panel discussion on what forms of legal protection should apply to software to best serve both the industry and society in general. After addressing that question, we can consider what laws would bring this about.
Abstract:
This paper addresses the problem of nonlinear multivariate root finding. In an earlier paper we described a system called Newton which finds roots of systems of nonlinear equations using refinements of interval methods. The refinements are inspired by AI constraint-propagation techniques. Newton is competitive with continuation methods on most benchmarks and can handle a variety of cases that are infeasible for continuation methods. This paper presents three "cuts" which we believe capture the essential theoretical ideas behind the success of Newton. The cuts are described in a concise and abstract manner which, we believe, makes the theoretical content of our work more apparent. Any implementation will need to adopt some heuristic control mechanism; heuristic control of the cuts is only briefly discussed here.
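As a one-dimensional illustration of the interval-Newton contraction underlying such interval methods (a sketch only, not the paper's three cuts or its constraint-propagation machinery), the following code narrows an interval known to contain a root of f(x) = x² - 2:

```python
# One interval-Newton contraction step on an interval X = [lo, hi]:
#   X' = X  ∩  ( m - f(m) / F'(X) ),   m = midpoint of X,
# where F'(X) = [2*lo, 2*hi] is an interval enclosure of f'(x) = 2x.
# Illustrative sketch with ordinary floats; a validated implementation
# would use outward (directed) rounding.

def interval_newton_step(lo, hi):
    """One contraction step; assumes the derivative enclosure excludes 0."""
    m = (lo + hi) / 2.0
    fm = m * m - 2.0                 # f at the midpoint
    dlo, dhi = 2.0 * lo, 2.0 * hi    # F'(X) = [2lo, 2hi], positive here
    q_lo, q_hi = sorted((fm / dlo, fm / dhi))
    # Newton interval m - [q_lo, q_hi], intersected with the input interval
    return max(lo, m - q_hi), min(hi, m - q_lo)

x = (1.0, 2.0)
for _ in range(3):
    x = interval_newton_step(*x)
print(x)  # a narrow interval still containing sqrt(2)
```

Each step is a contraction that provably keeps every root of f inside the interval, which is what makes such methods suitable for validated computation; the multivariate case replaces the scalar division with an interval linear-system solve.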
Abstract:
The dream of pervasive computing is slowly becoming a reality. A number of projects around the world are constantly contributing ideas and solutions that are bound to change the way we interact with our environments and with one another. An essential component of the future is a software infrastructure that is capable of supporting interactions on scales ranging from a single physical space to intercontinental collaborations. Such infrastructure must help applications adapt to very diverse environments and must protect people's privacy and respect their personal preferences. In this paper we indicate a number of limitations present in the software infrastructures proposed so far (including our previous work). We then describe the framework for building an infrastructure that satisfies the abovementioned criteria. This framework hinges on the concepts of delegation, arbitration and high-level service discovery. Components of our own implementation of such an infrastructure are presented.
Abstract:
This thesis presents SodaBot, a general-purpose software agent user-environment and construction system. Its primary component is the basic software agent --- a computational framework for building agents which is essentially an agent operating system. We also present a new language for programming the basic software agent whose primitives are designed around human-level descriptions of agent activity. Via this programming language, users can easily implement a wide range of typical software agent applications, e.g. personal on-line assistants and meeting-scheduling agents. The SodaBot system has been implemented and tested, and its description comprises the bulk of this thesis.
Abstract:
Software bugs are violated specifications. Debugging is the process that culminates in repairing a program so that it satisfies its specification. An important part of debugging is localization, whereby the smallest region of the program that manifests the bug is found. The Debugging Assistant (DEBUSSI) localizes bugs by reasoning about logical dependencies. DEBUSSI manipulates the assumptions that underlie a bug manifestation, eventually localizing the bug to one particular assumption. At the same time, DEBUSSI acquires specification information, thereby extending its understanding of the buggy program. The techniques used for debugging fully implemented code are also appropriate for validating partial designs.
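DEBUSSI itself reasons over logical dependencies; purely as a loose illustration of the idea of localizing a bug to one underlying assumption, the sketch below (all names hypothetical) retracts assumptions one at a time and re-checks whether the bug still manifests.

```python
# Hypothetical sketch, not DEBUSSI's algorithm: if a bug manifestation
# depends on a set of assumptions, retracting them one at a time and
# re-running the check localizes the bug to a single assumption.

def localize(assumptions, bug_manifests):
    """Return an assumption whose retraction makes the bug vanish."""
    for a in assumptions:
        remaining = [x for x in assumptions if x is not a]
        if not bug_manifests(remaining):
            return a
    return None

# Toy example: the check fails whenever "index-starts-at-1" is assumed.
assumptions = ["list-is-sorted", "index-starts-at-1", "no-duplicates"]
culprit = localize(assumptions, lambda active: "index-starts-at-1" in active)
print(culprit)  # index-starts-at-1
```

Dependency-directed reasoning, as in DEBUSSI, avoids this brute-force loop by following the logical dependencies from the manifestation back to the assumptions that support it.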
Abstract:
In order to make available a database of electrostatic-potential values for all protein structures deposited in the PDB, the program GRASP (Graphical Representation and Analysis of Structural Properties) (Nicholls et al., 1991) was used to generate this database.
Abstract:
Principles of the lightweight software process. Little bureaucracy and adaptation to the characteristics of the projects. Basic guidelines for project management and configuration management. Project management. Configuration management. Definition of the basic guidelines and of the audit process. Dissemination of a language for defining software representations. Use of public-domain tools. Testing early and often. Actions for deploying the lightweight software process. Definition of the basic guidelines and of the process audit. Identification of the good practices of Embrapa Informática Agropecuária. Dissemination of the lightweight software process. Related work.