985 results for Computers.


Relevance: 10.00%

Abstract:

Establishing high-speed parallel communication channels among personal computers, engineering workstations and industrial control machines based on different bus standards, and combining them into a multi-machine system, can provide enhanced system capabilities at relatively small cost. This paper introduces the concept of the bus bridge, a heterogeneous bus-interconnection backplane that provides parallel data channels between system buses. It discusses the bridge's structure and working principle, proposes an implementation scheme, and describes in particular the bidirectional parallel communication protocol defined for the bus bridge, BBP (Bus Bridge Protocol).
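The abstract does not specify BBP's wire-level behavior, so the following is only a schematic sketch of the strobe/acknowledge handshake such bidirectional parallel links commonly use; every register name, address and bit mask here is hypothetical:

```c
#include <stdint.h>

/* Hypothetical memory-mapped bridge registers; the real BBP
   register layout is not given in the abstract. */
#define BRIDGE_BASE 0xD0000000u
#define REG_DATA    (*(volatile uint16_t *)(BRIDGE_BASE + 0x0u))
#define REG_STATUS  (*(volatile uint16_t *)(BRIDGE_BASE + 0x2u))
#define REG_CONTROL (*(volatile uint16_t *)(BRIDGE_BASE + 0x4u))

#define ST_PEER_READY 0x0001u /* peer can accept a word */
#define ST_DATA_AVAIL 0x0002u /* peer has posted a word */
#define CT_STROBE     0x0001u /* latch REG_DATA into the bridge */
#define CT_ACK        0x0002u /* acknowledge a received word */

/* Send one word: wait until the peer is ready, place the word on
   the parallel data channel, then strobe it across. */
static void bbp_send(uint16_t word)
{
    while (!(REG_STATUS & ST_PEER_READY))
        ; /* busy-wait on the handshake line */
    REG_DATA = word;
    REG_CONTROL = CT_STROBE;
}

/* Receive one word: wait for data, read it, then acknowledge so
   the sender may proceed; the link is symmetric, so each side can
   both send and receive. */
static uint16_t bbp_recv(void)
{
    while (!(REG_STATUS & ST_DATA_AVAIL))
        ;
    uint16_t word = REG_DATA;
    REG_CONTROL = CT_ACK;
    return word;
}
```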

Relevance: 10.00%

Abstract:

Addressing the current state of computer-based mechanical drawing in China, this paper proposes an easy-to-master description language for mechanical drawings, together with a correspondingly capable plotting system. The system makes figure input convenient, provides translation, rotation and mirroring of figures, and handles cross-hatched section regions well. It includes a stroke-style character library dedicated to annotation (covering Chinese characters and special symbols). The system is written in FORTRAN and is therefore easy to port to large and medium-sized computers and to microcomputers with larger storage capacity.
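The translation, rotation and mirroring operations the system provides are the standard planar transforms; a minimal sketch of them applied to a polyline (written in C rather than the system's FORTRAN):

```c
#include <math.h>

/* A drawing is reduced to polylines of 2-D points. */
typedef struct { double x, y; } Point;

/* Translate every vertex by (dx, dy). */
void translate(Point *p, int n, double dx, double dy)
{
    for (int i = 0; i < n; i++) { p[i].x += dx; p[i].y += dy; }
}

/* Rotate every vertex by theta radians about the origin. */
void rotate(Point *p, int n, double theta)
{
    double c = cos(theta), s = sin(theta);
    for (int i = 0; i < n; i++) {
        double x = p[i].x, y = p[i].y;
        p[i].x = c * x - s * y;
        p[i].y = s * x + c * y;
    }
}

/* Mirror every vertex about the vertical line x = axis. */
void mirror_x(Point *p, int n, double axis)
{
    for (int i = 0; i < n; i++) p[i].x = 2.0 * axis - p[i].x;
}
```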

Relevance: 10.00%

Abstract:

Numerical modeling of groundwater is very important for understanding groundwater flow and solving hydrogeological problems. Today, groundwater studies require massive numbers of model cells and high calculation accuracy, which are beyond a single-CPU computer's capabilities. With the development of high-performance parallel computing technologies, applying parallel computing methods to numerical modeling of groundwater flow becomes necessary and important. Parallel computing can improve the ability to resolve various hydrogeological and environmental problems. In this study, parallel computing methods for the two main types of modern parallel computer architecture, shared memory systems and distributed shared memory systems, are discussed. OpenMP and MPI (PETSc) are both used to parallelize the most widely used groundwater simulator, MODFLOW. Two parallel solvers, P-PCG and P-MODFLOW, were developed for MODFLOW. The parallelized MODFLOW was used to simulate regional groundwater flow in Beishan, Gansu Province, a potential high-level radioactive waste geological disposal area in China.

1. The OpenMP programming paradigm was used to parallelize the PCG (preconditioned conjugate-gradient) solver, one of the main solvers for MODFLOW. The parallel PCG solver, P-PCG, was verified on an 8-processor computer. Both the impact of compilers and different model domain sizes were considered in the numerical experiments. The largest test model has 1000 columns, 1000 rows and 1000 layers. Based on the timing results, execution times using the P-PCG solver are typically about 1.40 to 5.31 times faster than those using the serial one. In addition, the simulation results are exactly the same as those of the original PCG solver, because the majority of the serial code was not changed. It is worth noting that this parallelization approach reduces software maintenance cost, because only a single-source PCG solver code needs to be maintained in the MODFLOW source tree.

2. P-MODFLOW, a domain decomposition-based model implemented in a parallel computing environment, was developed to allow efficient simulation of regional-scale groundwater flow. The basic approach partitions a large model domain into any number of sub-domains, and parallel processors solve the model equations within each sub-domain. Using domain decomposition to implement MODFLOW on distributed shared memory parallel systems extends MODFLOW to the most popular cluster systems, so that a large-scale simulation can take full advantage of hundreds or even thousands of parallel processors. P-MODFLOW shows good parallel performance, with a maximum speedup of 18.32 on 14 processors. Superlinear speedups were achieved in the parallel tests, indicating the efficiency and scalability of the code. Parallel program design, load balancing and full use of PETSc were considered in order to achieve a highly efficient parallel program.

3. Characterizing the regional groundwater flow system is very important for high-level radioactive waste geological disposal. The Beishan area, located in northwestern Gansu Province, China, has been selected as a potential site for a disposal repository. The area covers about 80,000 km² and has complicated hydrogeological conditions, which greatly increase the computational effort of regional groundwater flow models. To reduce computing time, the parallel computing scheme was applied to regional groundwater flow modeling. Models with over 10 million cells were used to simulate how faults and different recharge conditions affect the regional groundwater flow pattern. The results of this study provide regional groundwater flow information for site characterization of the potential high-level radioactive waste disposal area.
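The abstract states that P-PCG parallelizes the PCG solver with OpenMP while leaving most of the serial code intact. The kernels that dominate a conjugate-gradient iteration, inner products and vector updates, parallelize with simple loop directives; a minimal sketch of that pattern, not the actual P-PCG source:

```c
#include <omp.h>

/* Inner product with a parallel reduction: the pattern used for
   the dot products inside each PCG iteration. */
double pcg_dot(const double *a, const double *b, int n)
{
    double s = 0.0;
    #pragma omp parallel for reduction(+:s)
    for (int i = 0; i < n; i++)
        s += a[i] * b[i];
    return s;
}

/* y <- y + alpha * x: the vector update of each PCG step. */
void pcg_axpy(double *y, double alpha, const double *x, int n)
{
    #pragma omp parallel for
    for (int i = 0; i < n; i++)
        y[i] += alpha * x[i];
}
```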

Relevance: 10.00%

Abstract:

Many observations show that seismic anisotropy is very common in the crust and upper mantle of the Earth. Seismic anisotropy can provide clues about deformation and transport processes inside the Earth. In recent years abundant earthquake travel-time data have accumulated and computers have become more powerful, making the inversion of earthquake travel-time data practical. In this thesis we study the theory of elastic waves in anisotropic media and derive formulae for travel-time inversion. We present an iterative procedure to determine 21 elastic parameters from qP-wave travel times. No a priori assumptions about the heterogeneity or anisotropy of the model are made. The procedure is suitable for the case when nothing is known about the symmetry of anisotropy of the medium, as well as for earthquake travel-time inversion, which may involve various symmetries of anisotropy. The procedure is tested with a synthetic multiple-source offset VSP experiment. The results show that the formulae are correct and the procedure is practical. The results and the related theory indicate that anisotropic inversion needs more rays than the isotropic case: for a 2-D weakly anisotropic (WA) medium, at least 5 rays in different directions are needed to retrieve the elastic parameters at one grid point, and for a 3-D WA medium at least 15 rays in different directions are needed. The results also indicate that the starting background velocity has no influence on the final results, at least for the model we specified. Our results also show that insufficient illumination coverage slows down the convergence rate and makes the results more sensitive to noise. We apply the procedure to a set of field travel-time data from an artificial seismic survey for locating micro-seismic events around a tunnel, whose purpose is to find out whether the digging process and the stress conditions around the tunnel can generate micro-cracks. The size of the area is around 100 m. The anisotropy derived from qP travel times is the same as the anisotropy shown by apparent velocities, and is also consistent with the anisotropy derived from S-wave splitting.
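The ray counts quoted above follow from a parameter count. Under the standard first-order weak-anisotropy linearization (which the thesis's formulae presumably resemble, though the exact expressions derived there are not reproduced here), the qP travel-time perturbation is linear in the density-normalized elastic perturbations contracted with the ray direction:

```latex
\delta t = -\int_{\mathrm{ray}} \frac{\delta v}{v_0^{2}}\,\mathrm{d}s,
\qquad
\delta v \approx \frac{1}{2 v_0}\,\delta a_{ijkl}\, n_i n_j n_k n_l,
\qquad
a_{ijkl} = c_{ijkl}/\rho .
```

Because the product n_i n_j n_k n_l is fully symmetric in its indices, qP times are sensitive only to the fully symmetric part of the perturbation, which has (d+3 choose 4) independent components: 15 in 3-D and 5 in 2-D, matching the minimum ray counts stated above.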

Relevance: 10.00%

Abstract:

Under the auspices of the 'Knowledge Innovation Program' of CAS, the Institute of Geology and Geophysics has established the Broadband Seismic Laboratory. A new 24-bit high-resolution seismograph, the DAS24-3B, has been designed and manufactured as part of an effort to develop China's own seismic-array technology. Since these instruments will primarily be used in field operation, the system software of the data acquisition system (DAS) needs to be optimized to enhance its stability, compatibility and maintainability. The design of the DAS24-3B system software partly follows the advanced 72A-08 DAS. The system contains two dedicated communication programs, DNAPI-COM1 and DNAPI-LPT1, which are suitable for all standard industrial computers with an ECP parallel port and a serial port. Through these dedicated parallel and serial communication interfaces, the system software is split into three parts: the acquisition program, the user control program and the graphical display program. These function well as separate units and run correctly as a whole. The three parts of the DAS24-3B system software have different functions and advantages. The acquisition program controls the process of seismic data acquisition. The DAS24-3B reduces its power consumption and hard-disk read-write disturbance by using the extended memory attached to its CPU, which enlarges the system's data buffer and reduces the number of hard-disk read-write operations. Since the GPS receiver of the DAS is highly sensitive to the surrounding environment and may lose its signal, the acquisition program automatically tracks the GPS-locked time. The user control program configures the system's working environment, relays the user's commands to the DAS, and tracks the status of the DAS in real time. The graphical display program plots the data, converts data files into common formats, splits a data file into parts and combines data files into one. Both the user control program and the graphical display program are Windows 95/98 applications built on the Windows API (Application Programming Interface). Both are clear and user-friendly, using all kinds of window controls: menus, toolbars, status bars, dialogue boxes, message boxes, edit boxes, scrollbars, timer controls, buttons and so on. Two system exception-handling programs are provided to deal with trouble in the field. The DAS24-3B has been designed to be easier to use: more capable, more stable and simpler. It has been tested in the field and at base stations and has proved more suitable for field operation of seismic arrays than other domestic instruments.
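The abstract credits the reduced power draw and disk disturbance to using extended memory as an enlarged data buffer. A minimal sketch of that buffering pattern (the buffer size and the adc_read_sample hook are illustrative, not the DAS24-3B source):

```c
#include <stdio.h>
#include <stdint.h>

#define BUF_SAMPLES 65536           /* illustrative buffer size */

static int32_t buf[2][BUF_SAMPLES]; /* two banks in extended memory */

/* Assumed hardware hook: blocks until one 24-bit sample
   (sign-extended to 32 bits) is available from the digitizer. */
extern int32_t adc_read_sample(void);

void acquire(FILE *out)
{
    int bank = 0, n = 0;
    for (;;) {                      /* runs until acquisition stops */
        buf[bank][n++] = adc_read_sample();
        if (n == BUF_SAMPLES) {
            /* Flush a full bank in one write, then switch banks:
               fewer, larger disk operations mean less read-write
               disturbance and lower power, as the abstract notes. */
            fwrite(buf[bank], sizeof(int32_t), BUF_SAMPLES, out);
            bank ^= 1;
            n = 0;
        }
    }
}
```

A real implementation would overlap acquisition and disk writes, for example with DMA or interrupt-driven filling of the idle bank.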

Relevance: 10.00%

Abstract:

This work belongs to the national "973" technological project undertaken by Shengli Oilfield. The study area comprises the S212 turbidite reservoir and the S283 delta reservoir of the Sheng2 unit in the Shengtuo Oilfield, which has a 36-year history of water-injection development. Changes in macroscopic, microscopic and flow (filtrational) parameters, and their mechanisms, have been studied in the four water-cut stages (primary, moderate, high and super-high) using multi-disciplinary theories and approaches, computer techniques and physical simulation. Dynamic reservoir models for the different water-cut stages have been established. The study of the forming mechanism and distribution of residual oil revealed the main types and spatial distribution of residual oil in the different water-cut stages, and a distribution model has been built. Principles for selecting macroscopic, microscopic and flow parameters, optimization and selection standards, matching standards and laws, and a database of the various dynamic parameters in the different water-cut stages have been established, laying a good basis for revealing the dynamic change of the reservoir's macroscopic, microscopic and flow parameters and the distribution of residual oil. The study indicates that, in general, the macroscopic, microscopic and flow parameters slowly increase and improve in both the shallow turbidite and delta reservoirs as water cut increases, but different reservoirs have their own characteristics and laws of change. Parameters of the I~2 unit, whose petrophysical properties are better, increase more quickly than those of 8~3, whose petrophysical properties are less favorable. The changes were relatively quick in the high water-cut stage, and relatively slow from the primary to the moderate and from the high to the super-high water-cut stages. This paper is the first to propose that the reservoir's macroscopic, microscopic and flow parameters are controlled by the dynamic geological action of reservoir fluids, which is considered the major reason for the dynamic changes of reservoir parameters and for the formation and distribution of residual oil during reservoir development. Physical simulation of the flow parameters verified that the forming mechanism and distribution of residual oil in the different water-cut stages are also controlled by the dynamic geological action of reservoir fluids. The idea of fluid geological action during reservoir development extends the theory of development geology and has important practical value. This paper is also the first to construct dynamic geological and mathematical models and five modes of residual oil distribution for the Shengtuo Oilfield, achieving four-dimensional forecasting of residual oil distribution in the different water-cut stages. The dynamic changes and mechanisms of the reservoir's macroscopic, microscopic and flow parameters, and their change processes, have been revealed, and forecasting of residual oil distribution has been achieved by computer. The paper establishes the theories, approaches and techniques for studying and characterizing residual oil in different water-cut stages, and realizes dynamic forecasting of residual oil. This work has gained remarkable economic and social benefits in guiding field development. These theories and techniques are important for residual oil prediction in terrestrial faulted basins, not only in the Shengli Oilfield but throughout eastern China, and the study has furthered the theory of development geology.

Relevance: 10.00%

Abstract:

Scientists are faced with a dilemma: either they can write abstract programs that express their understanding of a problem, but which do not execute efficiently; or they can write programs that computers can execute efficiently, but which are difficult to write and difficult to understand. We have developed a compiler that uses partial evaluation and scheduling techniques to provide a solution to this dilemma.
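As a concrete illustration of what partial evaluation does (an example of the technique in general, not taken from the paper): specializing a general routine on inputs known at compile time removes abstraction overhead, while the scientist continues to write only the abstract program.

```c
/* General routine: abstract and readable, but pays loop and
   indexing overhead on every call. */
double dot(const double *w, const double *x, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += w[i] * x[i];
    return s;
}

/* Residual program after partially evaluating dot() with n = 3
   and w = {0.25, 0.5, 0.25} known statically: the loop is
   unrolled and the constants folded in, leaving straight-line
   code that a scheduler can then order for the target machine. */
double dot_smooth3(const double *x)
{
    return 0.25 * x[0] + 0.5 * x[1] + 0.25 * x[2];
}
```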

Relevance: 10.00%

Abstract:

Early and intermediate vision algorithms, such as smoothing and discontinuity detection, are often implemented on general-purpose serial, and more recently, parallel computers. Special-purpose hardware implementations of low-level vision algorithms may be needed to achieve real-time processing. This memo reviews and analyzes some hardware implementations of low-level vision algorithms. Two types of hardware implementations are considered: the digital signal processing chips of Ruetz (and Brodersen) and the analog VLSI circuits of Carver Mead. The advantages and disadvantages of these two approaches for producing a general, real-time vision system are considered.
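For concreteness, the two operations named above, smoothing and discontinuity detection, reduce to local kernels of the kind either a DSP chip or an analog circuit evaluates at every pixel; a minimal 1-D sketch:

```c
#include <math.h>

/* Three-point smoothing: each output is a weighted local average. */
void smooth(const float *in, float *out, int n)
{
    out[0] = in[0];
    out[n - 1] = in[n - 1];
    for (int i = 1; i < n - 1; i++)
        out[i] = 0.25f * in[i - 1] + 0.5f * in[i] + 0.25f * in[i + 1];
}

/* Discontinuity detection: mark where the local gradient of the
   smoothed signal exceeds a threshold. */
void edges(const float *s, int *mark, int n, float thresh)
{
    mark[0] = 0;
    for (int i = 1; i < n; i++)
        mark[i] = fabsf(s[i] - s[i - 1]) > thresh;
}
```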

Relevance: 10.00%

Abstract:

As the size of digital systems increases, the mean time between single component failures diminishes. To avoid component related failures, large computers must be fault-tolerant. In this paper, we focus on methods for achieving a high degree of fault-tolerance in multistage routing networks. We describe a multipath scheme for providing end-to-end fault-tolerance on large networks. The scheme improves routing performance while keeping network latency low. We also describe the novel routing component, RN1, which implements this scheme, showing how it can be the basic building block for fault-tolerant multistage routing networks.
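As summarized above, the multipath scheme gives a message several legal next hops, so a faulty or busy link can be sidestepped at routing time. A schematic sketch of that per-switch selection step (the actual RN1 logic is not given in this abstract):

```c
#include <stdbool.h>

#define PORTS 8   /* illustrative radix for one routing stage */

/* Assumed per-switch state: which output ports are marked faulty
   and which are currently occupied by other messages. */
typedef struct {
    bool faulty[PORTS];
    bool busy[PORTS];
} Switch;

/* Multipath routing step: every port in the candidate set leads
   toward the destination; take the first healthy, idle one.
   Returns -1 if the message must be retried, which is where
   end-to-end recovery takes over. */
int route(const Switch *sw, const int *candidates, int ncand)
{
    for (int i = 0; i < ncand; i++) {
        int p = candidates[i];
        if (!sw->faulty[p] && !sw->busy[p])
            return p;
    }
    return -1;
}
```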

Relevance: 10.00%

Abstract:

Computers and Thought are the two categories that together define Artificial Intelligence as a discipline. It is generally accepted that work in Artificial Intelligence over the last thirty years has had a strong influence on aspects of computer architectures. In this paper we also make the converse claim: that the state of computer architecture has been a strong influence on our models of thought. The Von Neumann model of computation has led Artificial Intelligence in particular directions. Intelligence in biological systems is completely different. Recent work in behavior-based Artificial Intelligence has produced new models of intelligence that are much closer in spirit to biological systems. The non-Von Neumann computational models they use share many characteristics with biological computation.

Relevance: 10.00%

Abstract:

The development of increasingly sophisticated and powerful computers in the last few decades has frequently stimulated comparisons between them and the human brain. Such comparisons will become more earnest as computers are applied more and more to tasks formerly associated with essentially human activities and capabilities. The expectation of a coming generation of "intelligent" computers and robots with sensory, motor and even "intellectual" skills comparable in quality to (and quantitatively surpassing) our own is becoming more widespread and is, I believe, leading to a new and potentially productive analytical science of "information processing". In no field has this new approach been so precisely formulated and so thoroughly exemplified as in the field of vision. As the dominant sensory modality of man, vision is one of the major keys to our mastery of the environment, to our understanding and control of the objects which surround us. If we wish to create robots capable of performing complex manipulative tasks in a changing environment, we must surely endow them with (among other things) adequate visual powers. How can we set about designing such flexible and adaptive robots? In designing them, can we make use of our rapidly growing knowledge of the human brain, and if so, how? At the same time, can our experiences in designing artificial vision systems help us to understand how the brain analyzes visual information?

Relevance: 10.00%

Abstract:

This report addresses the problem of fault tolerance to system failures for database systems that are to run on highly concurrent computers. It assumes that, in general, an application may have a wide distribution in the lifetimes of its transactions. Logging remains the method of choice for ensuring fault tolerance. Generational garbage collection techniques manage the limited disk space reserved for log information; this technique does not require periodic checkpoints and is well suited for applications with a broad range of transaction lifetimes. An arbitrarily large collection of parallel log streams provides the necessary disk bandwidth.
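A hedged sketch of that combination, parallel log streams with generational reclamation of log segments; the data layout and reclamation rule here are illustrative, not the report's design:

```c
#include <stdbool.h>

#define SEGS 64   /* illustrative segments per log stream */

/* One generation-managed log stream; a system would run many such
   streams in parallel to obtain disk bandwidth. */
typedef struct {
    bool in_use[SEGS];
    long youngest_txn[SEGS]; /* newest transaction logged in segment */
} LogStream;

/* Reclaim segments whose every logged transaction has completed.
   With a broad range of transaction lifetimes, recently written
   segments die quickly (short transactions commit fast) while a
   few old segments stay pinned by long-running transactions: the
   generational effect that removes the need for periodic
   checkpoints. */
int collect(LogStream *s, long oldest_active_txn)
{
    int freed = 0;
    for (int i = 0; i < SEGS; i++)
        if (s->in_use[i] && s->youngest_txn[i] < oldest_active_txn) {
            s->in_use[i] = false;
            freed++;
        }
    return freed;
}
```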

Relevance: 10.00%

Abstract:

Nonlinear multivariate statistical techniques on fast computers offer the potential to capture more of the dynamics of the high dimensional, noisy systems underlying financial markets than traditional models, while making fewer restrictive assumptions. This thesis presents a collection of practical techniques to address important estimation and confidence issues for Radial Basis Function networks arising from such a data driven approach, including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the "data mining" problem. Novel applications in the finance area are described, including customized, adaptive option pricing and stock price prediction.
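For readers unfamiliar with the model class, a generic Radial Basis Function network (the textbook form, not the thesis's specific estimators) predicts with a weighted sum of kernels centered on prototype points:

```c
#include <math.h>

/* Evaluate an RBF network with Gaussian units:
     f(x) = sum_k w[k] * exp(-||x - c_k||^2 / (2 * sigma[k]^2))
   centers is a k-by-d array stored row-major. */
double rbf_eval(const double *x, int d,
                const double *centers, const double *sigma,
                const double *w, int k)
{
    double y = 0.0;
    for (int j = 0; j < k; j++) {
        double r2 = 0.0;
        for (int i = 0; i < d; i++) {
            double diff = x[i] - centers[j * d + i];
            r2 += diff * diff;
        }
        y += w[j] * exp(-r2 / (2.0 * sigma[j] * sigma[j]));
    }
    return y;
}
```

Pruning, in this setting, amounts to deleting units whose weights contribute little to the fit; the thesis's estimation and confidence machinery operates on models of this form.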

Relevance: 10.00%

Abstract:

This thesis presents methods for implementing robust hexapod locomotion on an autonomous robot with many sensors and actuators. The controller is based on the Subsumption Architecture and is fully distributed over approximately 1500 simple, concurrent processes. The robot, Hannibal, weighs approximately 6 pounds and is equipped with over 100 physical sensors, 19 degrees of freedom, and 8 onboard computers. We investigate the following topics in depth: distributed control of a complex robot, insect-inspired locomotion control for gait generation and rough-terrain mobility, and fault tolerance. The controller was implemented, debugged, and tested on Hannibal. Through a series of experiments, we examined Hannibal's gait generation, rough-terrain locomotion, and fault-tolerance performance. These results demonstrate that Hannibal exhibits robust, flexible, real-time locomotion over a variety of terrain and tolerates a multitude of hardware failures.
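The Subsumption Architecture mentioned above arbitrates among concurrently running behaviors by fixed priority, with higher layers suppressing the outputs of lower ones. A toy sketch of that arbitration for a single actuator (the layering and names are illustrative, not Hannibal's 1500-process controller):

```c
#include <stdbool.h>

/* Each concurrent behavior either stays silent or requests an
   actuator command. */
typedef struct {
    bool active;     /* does this behavior want control now? */
    double command;  /* requested actuator setting */
} Behavior;

/* Layers ordered from lowest to highest priority, e.g. walk,
   avoid_obstacle, right_self (illustrative names). The highest
   active layer subsumes all lower ones. */
double arbitrate(const Behavior *layer, int nlayers, double fallback)
{
    double cmd = fallback;
    for (int i = 0; i < nlayers; i++)
        if (layer[i].active)
            cmd = layer[i].command;  /* later (higher) layers win */
    return cmd;
}
```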

Relevance: 10.00%

Abstract:

This thesis examines a tactile sensor and a thermal sensor for use with the Utah-MIT dexterous four-fingered hand. Sensory feedback is critical for full utilization of its advanced manipulatory capabilities. The hand itself provides tendon tension and joint angle information. However, planned control algorithms require more information than these sources can provide. The tactile sensor utilizes capacitive transduction with a novel design based entirely on silicone elastomers. It provides an 8 x 8 array of force cells with 1.9 mm center-to-center spacing. A pressure resolution of 8 significant bits is available over a 0 to 200 grams per square mm range. The thermal sensor measures a material's heat conductivity by radiating heat into an object and measuring the resulting temperature variations. This sensor has a 4 x 4 array of temperature cells with 3.5 mm center-to-center spacing. Experiments show that the thermal sensor can discriminate among materials by detecting differences in their thermal conduction properties. Both sensors meet the stringent mounting requirements posed by the Utah-MIT hand. Combining them to form a sensor with both tactile and thermal capabilities will ultimately be possible. The computational requirements for controlling a sensor-equipped dexterous hand are severe. Conventional single-processor computers do not provide adequate performance. To overcome these difficulties, a computational architecture based on interconnecting high-performance microcomputers and a set of software primitives tailored for sensor-driven control has been proposed. The system has been implemented and tested on the Utah-MIT hand. The hand, equipped with tactile and thermal sensors and controlled by its computational architecture, is one of the most advanced robotic manipulatory devices available worldwide. Other ongoing projects will exploit these tools and allow the hand to perform tasks that exceed the capabilities of current-generation robots.
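For scale, the quoted resolution works out to steps of about 200/256, roughly 0.78 g/mm² per count, over the full range. A hedged sketch of converting raw counts from the 8 x 8 array into pressures, assuming a linear per-cell calibration (the real elastomer response need not be linear, and any per-cell gain or offset table would come from calibration):

```c
#include <stdint.h>

#define ROWS 8
#define COLS 8
#define FULL_SCALE 200.0   /* grams per square mm, from the abstract */

/* Map each 8-bit reading from the capacitive array onto the stated
   0 to 200 g/mm^2 range; 1 count is roughly 0.78 g/mm^2. */
void to_pressure(const uint8_t raw[ROWS][COLS],
                 double p[ROWS][COLS])
{
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            p[r][c] = (raw[r][c] / 255.0) * FULL_SCALE;
}
```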