Abstract:
The Superconducting Electron Cyclotron Resonance ion source with Advanced design in Lanzhou (SECRAL) is an all-superconducting-magnet electron cyclotron resonance ion source (ECRIS) for the production of intense highly charged ion beams to meet the requirements of the Heavy Ion Research Facility in Lanzhou (HIRFL). To further enhance the performance of SECRAL, an aluminum chamber has been installed inside a 1.5 mm thick Ta liner, which is used to reduce x-ray irradiation of the high-voltage insulator. With double-frequency (18 + 14.5 GHz) heating and a maximum total microwave power of 2.0 kW, SECRAL has successfully produced a number of very highly charged Xe ion beams, such as 10 eμA of Xe37+, 1 eμA of Xe43+, and 0.16 eμA of Ne-like Xe44+. To further explore SECRAL's capability in the production of highly charged heavy-metal ion beams, a first test run on bismuth has recently been carried out. The main goals were to produce an intense Bi31+ beam for the HIRFL accelerator and to gauge how well SECRAL can perform in the production of very highly charged Bi beams. During the test, at a microwave power of less than 3 kW, more than 150 eμA of Bi31+, 22 eμA of Bi41+, and 1.5 eμA of Bi50+ were produced. These results have again demonstrated the great capability of the SECRAL source. This article presents the detailed results, with brief discussions, of the production of highly charged ion beams with SECRAL.
Abstract:
The direct reduction of SO2 to elemental sulfur in flue gas by coupling cold plasma with a catalyst, a new approach to SO2 reduction, was studied. In this process, CO2 is dissociated under the cold plasma to form CO, which acts as the reductant. With the coupling of the cold plasma and the catalyst, sulfur dioxide was selectively reduced by CO to elemental sulfur, with a metal sulfate (e.g., FeSO4) as a byproduct. In the present work, Fe2O3/gamma-Al2O3 was employed as the catalyst. The extent of desulfurization was more than 80%, and the selectivity toward elemental sulfur was about 55%. The effects of water vapor, temperature, and the components of the simulated flue gas were investigated. In addition, thermogravimetry coupled with infrared spectroscopy, together with chemical analysis, was employed to evaluate the used catalyst. This paper focuses on the discussion of the catalyst; a detailed discussion of the plasma will be presented in another paper.
Abstract:
Regional soil erosion survey, mapping, and dynamic analysis provide the data foundation for national and provincial macro-level soil erosion planning, and constitute a major frontier research topic. Based on a review of the current state of research, at home and abroad, on regional soil erosion survey and mapping, regional soil erosion factors, the scale effects of soil erosion, and soil erosion models, the following recommendations are made for the forthcoming national soil erosion census: the census should make full use of the latest achievements of China's soil erosion model research and adopt simulation-based computation to quantitatively estimate soil erosion intensity; the survey should cover regional soil erosion factors, soil erosion types and intensities, soil and water conservation measures, and sampling surveys of typical regions; and dedicated research should be devoted to key technologies such as the scale effects of soil erosion, demonstration applications of regional soil erosion models, and methods for building regional soil erosion databases.
Abstract:
The preparation of deoxyribonucleic acid (DNA) samples for scanning tunneling microscopy (STM) and atomic force microscopy (AFM) is reviewed. The discussion focuses on methods for immobilizing and spreading DNA samples, their advantages and drawbacks, and possible improvements.
Abstract:
A review, with 44 references, is presented on the development of sol-gel-based biosensors. The discussion focuses on the process, advantages, and properties of the sol-gel immobilization method, and on sol-gel optical and amperometric biosensors; trends in this field are also forecast.
Abstract:
Based on ray theory and Longuet-Higgins' linear model of sea waves, the joint distribution of the wave envelope and the apparent wave number vector is established. From this joint distribution we define a new concept, the outer wave number spectrum, to describe the outer characteristics of ocean waves. The analytical form of the outer wave number spectrum and the probability distributions of the apparent wave number vector and its components are then derived. The outer wave number spectrum is compared with the inner wave number spectrum for the average state of wind-wave development corresponding to a peakedness factor P = 3. Discussions of the similarities and differences between the outer and inner wave number spectra are also presented in the paper. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Micro-fabrication robot technology is an important branch of MEMS technology and a current hot topic in robotics research. This paper analyzes the characteristics of integrated micromanipulation robot systems and discusses the key technologies involved in developing micro-fabrication robot systems, such as actuation, positioning, sensing, and control.
Abstract:
This paper focuses on the strategies adopted, and the problems encountered, in solving several difficult issues during system implementation: task decomposition and the issuing of locomotion commands; time-slot allocation and synchronization; landmark-based localization and correction of locomotion errors; the sensing, measurement of, and response to dynamic obstacles; and the emergency handling of exceptional situations.
Abstract:
Based on fractal theory, contractive mapping principles, and fixed point theory, and by means of affine transforms, this dissertation develops a novel Explicit Fractal Interpolation Function (EFIF) that can be used to reconstruct seismic data with high fidelity and precision. Spatial trace interpolation is one of the important issues in seismic data processing. Under ideal circumstances, seismic data should be sampled with uniform spatial coverage. In practice, however, constraints such as complex surface conditions mean that the sampling density may be sparse, or that some traces may be lost for other reasons. Wide spacing between receivers can result in sparse sampling along traverse lines and hence in spatial aliasing of short-wavelength features. An interpolation method is therefore of great importance; it must preserve not only the amplitude information but also the phase information, especially at points where the phase changes sharply. Several interpolation methods have been proposed, but this dissertation focuses on a special class of fractal interpolation function, the explicit fractal interpolation function, to improve the accuracy of the interpolation reconstruction and to bring out local information. The traditional fractal interpolation method is mainly based on the random fractional Brownian motion (fBm) model; moreover, the vertical scaling factor, which plays a critical role in the implementation of fractal interpolation, is assigned the same value throughout the interpolation process, so local information cannot be brought out. In addition, the main drawback of the traditional fractal interpolation method is that it cannot provide the function values at the interpolation nodes, so the node errors cannot be analyzed quantitatively and the feasibility of the method cannot be evaluated.
Detailed discussions of the applications of fractal interpolation in seismology have not been given by previous workers, let alone the interpolation of single-trace seismograms. On the basis of previous work and fractal theory, this dissertation discusses fractal interpolation thoroughly, analyzes the stability of this special kind of interpolating function, and proposes an explicit expression for the vertical scaling factor, which controls the precision of the interpolation. This novel method extends the traditional fractal interpolation method and converts fractal interpolation with random algorithms into interpolation with deterministic algorithms. A binary-tree data structure is applied during the interpolation process, which avoids the iteration that is inevitable in traditional fractal interpolation and improves computational efficiency. To illustrate the validity of the novel method, this dissertation develops several theoretical models, synthesizes common shot gathers and seismograms, and reconstructs traces that were erased from the initial section using the explicit fractal interpolation method. To compare quantitatively the waveforms and amplitudes of the theoretical traces erased from the initial section with the reconstructed traces, each missing trace is reconstructed and the residuals are analyzed. The numerical experiments demonstrate that the novel fractal interpolation method is applicable not only to seismograms with small offsets but also to those with large offsets. Seismograms reconstructed by the explicit fractal interpolation method closely resemble the originals: the waveforms of the missing traces are estimated very well, and the amplitudes of the interpolated traces are a good approximation to the original ones.
The high precision and computational efficiency of explicit fractal interpolation make it a useful tool for reconstructing seismic data; it not only brings out local information but also preserves the overall characteristics of the object investigated. To illustrate the influence of the explicit fractal interpolation method on the accuracy of imaging structure in the Earth's interior, this dissertation applies the method to reverse-time migration. Imaging sections obtained from the fractal-interpolated reflection data closely resemble the originals. The numerical experiments demonstrate that, even with sparse sampling, highly accurate images of the structure of the Earth's interior can still be obtained by means of the explicit fractal interpolation method, so that high-quality imaging results can be obtained with a relatively small number of seismic stations. With the fractal interpolation method, the efficiency and accuracy of reverse-time migration can be improved under economic constraints. To verify the effectiveness of the method on real data, it was tested on data provided by the Broadband Seismic Array Laboratory, IGGCAS. The results demonstrate that the accuracy of explicit fractal interpolation remains very high even for real data with large epicentral distances and large offsets: the amplitudes and phases of the reconstructed station data closely resemble the originals that were erased from the initial section. Altogether, the novel fractal interpolation function provides a new and useful tool for reconstructing seismic data with high precision and efficiency, and presents an alternative means of accurately imaging the deep structure of the Earth.
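The affine construction underlying a fractal interpolation function can be sketched as follows. This is a minimal generic sketch, not the dissertation's EFIF: the function name `fractal_interpolate`, the node arrays, and the per-interval vertical scaling factors `d` are all illustrative assumptions. Each interval gets an affine map whose coefficients are fixed by requiring the map to send the full data span onto that interval, so the attractor passes through every interpolation node exactly; the vertical scaling factors control the added local roughness.

```python
import numpy as np

def fractal_interpolate(x, y, d, levels=4):
    """Affine fractal interpolation through the nodes (x[i], y[i]).

    d[i] is the vertical scaling factor for interval i (|d[i]| < 1).
    Returns arrays (xs, ys) sampling the attractor, which passes
    through every interpolation node.
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    d = np.asarray(d, float)
    N = len(x) - 1  # number of intervals; d must have length N
    span = x[-1] - x[0]
    # Affine map for interval i: w_i(t, v) = (a_i t + e_i, c_i t + d_i v + f_i),
    # chosen so that w_i sends (x_0, y_0) -> (x_{i-1}, y_{i-1})
    # and (x_N, y_N) -> (x_i, y_i).
    a = (x[1:] - x[:-1]) / span
    e = (x[-1] * x[:-1] - x[0] * x[1:]) / span
    c = (y[1:] - y[:-1] - d * (y[-1] - y[0])) / span
    f = (x[-1] * y[:-1] - x[0] * y[1:] - d * (x[-1] * y[0] - x[0] * y[-1])) / span
    xs, ys = x.copy(), y.copy()
    for _ in range(levels):
        # Deterministic iteration: apply every map to the current point set.
        new_x = np.concatenate([a[i] * xs + e[i] for i in range(N)])
        new_y = np.concatenate([c[i] * xs + d[i] * ys + f[i] for i in range(N)])
        order = np.argsort(new_x)
        xs, ys = new_x[order], new_y[order]
    return xs, ys
```

Because the iteration is deterministic (no random "chaos game"), the values at the interpolation nodes are available exactly, which is what makes a quantitative node-error analysis possible.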
Abstract:
Stochastic reservoir modeling is a technique used in reservoir description. Through this technique, multiple data sources at different scales can be integrated into the reservoir model, and the model's uncertainty can be conveyed to researchers and supervisors. With its digital models, changeable scales, honoring of known information and data, and conveyance of model uncertainty, stochastic reservoir modeling provides a mathematical framework, or platform, for researchers to integrate multiple data sources and information at different scales into their prediction models. As a relatively new method, stochastic reservoir modeling is on the upswing. Based on related work, this paper, starting with the Markov property in reservoirs, illustrates how to construct spatial models for categorical and continuous variables using Markov random fields. To explore reservoir properties, researchers must study the properties of the rocks embedded in reservoirs. Apart from laboratory methods, geophysical measurements and their subsequent interpretation are the main sources of the information and data used in petroleum exploration and exploitation. Building a model for flow simulation from incomplete information amounts to predicting the spatial distributions of the various reservoir variables. Considering data sources, digital extent, and methods, reservoir modeling can be grouped into four categories: methods based on reservoir sedimentology, reservoir seismic prediction, kriging, and stochastic reservoir modeling. The application of Markov chain models to the modeling of sedimentary strata is introduced in the third part of the paper.
This part presents the concept of the Markov chain model, the N-step transition probability matrix, the stationary distribution, the estimation of the transition probability matrix, the testing of the Markov property, two methods for organizing sections (based on equal intervals and based on rock facies), the embedded Markov matrix, the semi-Markov chain model, and the hidden Markov chain model. Based on the 1-D Markov chain model, the conditional 1-D Markov chain model is discussed in the fourth part. By extending the 1-D Markov chain model to 2-D and 3-D situations, conditional 2-D and 3-D Markov chain models are presented. This part also discusses the estimation of vertical and lateral transition probabilities and the initialization of the top boundary. Corresponding digital models are used to illustrate, or verify, the related discussions. The fifth part, building on the fourth part and on the application of Markov random fields (MRFs) in image analysis, discusses an MRF-based method for simulating the spatial distribution of categorical reservoir variables. In this part, the probability of a particular configuration of categorical variables, the definition of the energy function for a categorical variable field as a Markov random field, the Strauss model, and the estimation of the components of the energy function are presented. Corresponding digital models are again used to illustrate, or verify, the related discussions. For the simulation of the spatial distribution of continuous reservoir variables, the sixth part mainly explores two methods. The first is a purely Gaussian-MRF (GMRF) based method; the related content includes the GMRF model and its neighborhood, parameter estimation, and the MCMC iteration method, with a digital example illustrating the method. The second is a two-stage model method.
Based on the results of the categorical variable distribution simulation, this method, taking the GMRF as the prior distribution for the continuous variables and using the relationship between categorical variables such as rock facies and continuous variables such as porosity, permeability, and fluid saturation, can produce a series of stochastic images of the spatial distribution of the continuous variables. Integrating multiple data sources into the reservoir model is one of the merits of stochastic reservoir modeling. After discussing how to model the spatial distributions of categorical and continuous reservoir variables, the paper explores how to combine conceptual depositional models, well logs, cores, seismic attributes, and production history.
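The 1-D Markov chain machinery listed above (estimating a transition probability matrix from a vertical facies log, raising it to the Nth power for N-step probabilities, and finding the stationary distribution) can be sketched as follows. This is a minimal generic sketch under standard definitions, not the paper's own code; all function names and the toy facies sequence are illustrative assumptions.

```python
import numpy as np

def transition_matrix(sequence, n_states):
    """Estimate a one-step transition matrix from a facies sequence.

    counts[i, j] counts observed i -> j transitions; each row is then
    normalised so that row i gives P(next state = j | current state = i).
    Rows with no observations are left as zeros.
    """
    counts = np.zeros((n_states, n_states))
    for a, b in zip(sequence[:-1], sequence[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

def n_step(P, n):
    """N-step transition probability matrix: simply P raised to the n-th power."""
    return np.linalg.matrix_power(P, n)

def stationary(P):
    """Stationary distribution: the left eigenvector of P for eigenvalue 1,
    normalised to sum to one (assumes the chain is irreducible)."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()
```

A conditional chain, as discussed in the fourth part, would additionally constrain the simulated sequence to honor known states at wells; the unconditional building blocks above are the prerequisites for that step.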
Abstract:
The decision making of customers has been a great concern in the field of customer research. Although China has entered an era of brand consumption and development, differences between the attributes that companies and customers regard as important give rise to the phenomenon that "award-winning products do not sell well, while products that sell well win no awards." At the same time, little research on the relationship between brands and customers has yet been conducted in China. Traditional research on customer psychology employs questionnaires, depth interviews, and group discussions as its major methods. In cognitive psychology, the limitations of explicit memory have been revealed by implicit memory; moreover, unconscious cognition and implicit memory can also influence customers' evaluation of a brand. The traditional methods are therefore not accurate enough. Reaction time is an effective way to reveal test quality, and it can also reveal implicit cognition. Building on this, the present research investigates the validity of identifying regarded attributes by the reaction-time method, using questionnaires and reaction-time testing of 360 customers in 3 cities, which may overcome the limitations of the traditional research methods. The 352 valid samples were analyzed with SPSS. The results showed no distinct correspondence between product attributes and reaction time. The key attributes identified by questionnaire importance rating and by the shortest-reaction-time criterion were then used in regression analyses of customers' overall ratings (such as overall satisfaction, objective quality, and recommendation intention). The results indicated that the regression coefficient of the attributes selected by reaction time on the overall rating was significant, while that of the attributes selected by importance rating was not. The main conclusions are: 1.
Regarded attributes can be obtained from the reaction times of brand performance ratings. 2. Regarded attributes obtained from the reaction times of brand performance ratings are more accurate than those obtained from importance-rating questionnaires. 3. A brand's core attributes should include the attributes regarded during the decision-making process.
Abstract:
The future of the software industry is today being shaped in the courtroom. Most discussions of intellectual property to date, however, have been framed as debates about how the existing law --- promulgated long before the computer revolution --- should be applied to software. This memo is a transcript of a panel discussion on what forms of legal protection should apply to software to best serve both the industry and society in general. Having addressed that question, we can then consider what laws would bring this about.
Abstract:
These proceedings summarize the results of the First PHANToM User's Group Workshop, held September 27-30, 1996, at MIT. The goal of the workshop was to bring together a group of active users of the PHANToM Haptic Interface to discuss the scientific and engineering challenges involved in bringing haptics into widespread use, and to explore the future possibilities of this exciting technology. With over 50 attendees and 25 presentations, the workshop provided the first large forum for users of a common haptic interface to share results and engage in collaborative discussions. Short papers from the presenters are contained herein and address the following topics: Research Effort Overviews; Displays and Effects; Applications in Teleoperation and Training; Tools for Simulated Worlds; and Data Visualization.
Abstract:
Much has been published in recent years about the desirable nature of facilitated interactions in online discussions with educational purposes. However, little has been reported about the roles that tutors actually adopt in real-life learning contexts, how these range between 'tutoring', 'managing', and 'facilitating', and what the distinctions between these three roles may be. In this paper, the choices of priorities in e-moderation made in three naturalistic (real-life) case studies by three higher education practitioners are identified and discussed. These contrasting approaches were captured and analysed using grounded theory principles. The paper also discusses those occasions when the facilitation was less effective than might have been desired. It finally summarises the potential of various approaches within e-moderation, along with some of the attendant risks. The finding is that principles and practices developed for face-to-face support of student-directed learning were found equally applicable to e-moderated online group work, despite several significant differences between the two types of setting. Keywords: higher education, e-learning, e-moderation, asynchronous discussions, learning outcomes, grounded theory