905 results for Data storage
Abstract:
A high-speed data acquisition and storage method based on the CAN bus is proposed for real-time control tasks. The method assigns real-time data acquisition, real-time system control, and data storage and display to mutually independent processing units, which communicate over a dedicated CAN network. This acquisition scheme resolves the conflict between real-time data acquisition and real-time system control, and a queue technique resolves the conflict between data storage and real-time data display. The method has been applied successfully in a large-scale simulation system.
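The queue technique mentioned above, which decouples storage from real-time display, is essentially a producer/consumer pattern. A minimal sketch in Python follows; the thread-and-queue structure is an illustration only, since the original design runs on separate processing units linked by CAN:

```python
import queue
import threading
import time

samples = queue.Queue(maxsize=1024)   # buffers bursts from the acquisition side

def acquire():
    """Producer: stands in for the real-time acquisition unit on the CAN bus."""
    for i in range(100):
        samples.put((time.time(), i))  # blocks only if the queue is full
        time.sleep(0.001)
    samples.put(None)                  # sentinel: acquisition finished

def store_and_display():
    """Consumer: drains the queue at its own pace, so slow disk writes or
    screen updates never stall acquisition."""
    while (item := samples.get()) is not None:
        timestamp, value = item
        # write to disk / update the display here
    print("storage/display drained the queue")

threading.Thread(target=acquire).start()
store_and_display()
```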
Abstract:
A control system for a remotely operated vehicle (ROV), built on industrial Ethernet communication technology and the Windows platform, is introduced. Applied to a newly developed ROV, the system shows clear advantages over traditional control systems in communication capacity, video transmission, control performance, hardware extensibility, and data storage and display. Tank tests verified the sound motion functions and performance of the control system and of the vehicle as a whole.
Abstract:
This paper introduces information display and storage technology for a manned submersible based on industrial Ethernet. The software architecture of the system and the characteristics of its modules are described in detail, along with the transmission, display, and storage of the data streams within the system. The system has been tested in a water tank with good results.
Abstract:
In this paper, through extensive study and design, a technical plan for establishing an exploration database center is developed that combines imported and self-developed techniques. Through research and repeated experiment, a modern database center has been set up whose hardware and network deliver advanced performance, whose system is well configured, whose data storage and management are complete, and whose data support is fast and direct. Based on a study of decision theory, methods, and models, an exploration decision assistant schema is designed; one decision plan, the well-location decision support system, has been evaluated and put into action.
1. Establishment of the Shengli exploration database center. Research covers the hardware configuration of the database center, including its workstations and all connected hardware and systems. The hardware comprises interconnected workstations, microcomputer workstations, disk arrays, and the equipment used for seismic processing and interpretation. Research on data storage and management includes analysis of the contents to be managed, data flow, data standards, data quality control, backup and restore policy, and optimization of the database system. A reasonable data management regulation and workflow have been established, creating a scientific exploration data management system. Data loading followed a fixed schedule; more than 200 seismic survey projects, amounting to 25 TB, have been loaded.
2. Exploration work support system and its application. The seismic data processing support system offers automatic extraction of seismic attributes, GIS navigation, data ordering, extraction of data cubes of any size, a pseudo huge-capacity disk array, and a standard output exchange format. Prestack data can be accessed directly by the processing system or transferred to other processing systems through the standard exchange format. Support for seismic interpretation includes automatic scanning and storage of interpretation results and internal data quality control; the interpretation system connects directly to the database center for real-time access to seismic, formation, and well data. Comprehensive geological study is supported through the intranet, with the ability to query data or display it graphically on the navigation system under geological constraints. The production management support system collects, analyzes, and displays production data; its core technology is controlled data collection and the creation of multiple standard forms.
3. Exploration decision support system design. By classifying the workflow and data flow of all exploration stages and studying decision theory and methods, the target of each decision step, the decision models, and the requirements, three concept models have been formed for the Shengli exploration decision support system: the exploration distribution support system, the well-location support system, and the production management support system. The well-location decision support system has passed evaluation and been put into action.
4. Technical advances. Hardware and software are matched for high performance. By combining a parallel computer system, database server, huge-capacity automated tape library (ATL), disk array, network, and firewall, the first exploration database center in China was created, with a reasonable configuration, high performance, and the capacity to manage the complete exploration data sets. A management technology for huge volumes of exploration data has been formed, in which exploration data standards and management regulations guarantee data quality, safety, and security. A multifunction query and support system provides comprehensive exploration information support, covering geological study, seismic processing and interpretation, and production management; many new database and computer technologies are used to provide real-time information support for exploration work. Finally, the Shengli exploration decision support system was designed.
5. Application and benefit. Data storage has reached 25 TB, with thousands of users in the Shengli oil field accessing data and improving work efficiency several times over. The technology has also been adopted by many other SINOPEC units. Providing data to the project "Exploration Achievements and Evaluation of Favorable Targets in the Hekou Area" shortened the data preparation period from 30 days to 2 days and enriched data abundance by 15 percent, with full information support from the database center. Providing previously processed results to the project "Pre-stack Depth Migration in the Guxi Fracture Zone" reduced repeated processing, shortened the work period by one month, improved processing precision and quality, and saved 30 million yuan in data processing investment. Automatically providing a project database for the project "Geological and Seismic Study of the Southern Slope Zone of Dongying Sag" shortened data preparation time so that researchers had more time for research, improving interpretation precision and quality.
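The "extraction of data cubes of any size" service described above can be pictured as range slicing over a 3-D seismic volume. A minimal sketch in Python/NumPy; the array shape and the (inline, crossline, time sample) axis convention are assumptions for illustration, not the Shengli center's actual data layout:

```python
import numpy as np

# Hypothetical post-stack seismic volume: axes (inline, crossline, time sample).
volume = np.random.rand(400, 300, 1000).astype(np.float32)

def extract_cube(vol, il_range, xl_range, t_range):
    """Return an arbitrarily sized sub-cube given half-open index ranges."""
    il0, il1 = il_range
    xl0, xl1 = xl_range
    t0, t1 = t_range
    return vol[il0:il1, xl0:xl1, t0:t1].copy()

sub = extract_cube(volume, (100, 220), (50, 180), (200, 700))
print(sub.shape)  # (120, 130, 500)
```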
Abstract:
Debris landslides are among the most widely distributed and numerous types of landslide, the type most closely tied to engineering construction, and one of the most important disaster-causing types. They typically occur in loose slopes built of deposits formed by fill, residual soil, slope wash, talus, earlier landslides, or fully weathered hard rock. Rainfall is usually the chief triggering factor. Research on the stability of debris landslides during rainfall therefore has theoretical significance for understanding their development, deformation, and failure mechanisms, and practical significance for investigating, appraising, forecasting, preventing, and controlling them. Based on a wide collection of domestic and foreign data, this thesis systematically summarizes the relationships between rainfall and landslides, methods for surveying the water table within landslides, the deformation and failure mechanism of debris landslides, and progress in landslide stability analysis; open problems in the study of debris landslide stability during rainfall are reviewed and discussed. Because of complicated geological conditions and the randomness of rainfall, research on landslide stability must be grounded in qualitative engineering geological analysis. Drawing on data from the Panxi region and the Three Gorges Reservoir region, the author systematically summarizes the engineering geological and hydrogeological conditions, the distribution of the stress field in the slope, and the physical, mechanical, and hydro-mechanical properties of debris. From the viewpoint of soil water dynamics and hydromechanics, the physical process by which rainfall recharges the groundwater of a debris landslide can be divided into two phases: an unsaturated steady infiltration phase and a saturated unsteady recharge phase. The former can be described by a mathematical model of surface water infiltration, the latter by an equivalent continuous medium model of groundwater seepage. For a specific hydrogeological system, once the boundary conditions have been ascertained, the dynamic variation of water content, water table, and landslide stability with rainfall amount and duration can be obtained. This is a new way to study how groundwater within landslides responds to rainfall. After the wetting front in the rock and soil mass connects with the groundwater table, rainfall recharge raises the water table; the groundwater then interacts with the rock and soil mass through physical, hydraulic, chemical, and mechanical actions, decreasing the shear strength of the sliding zone. Based on the action of groundwater on rock and soil mass, a concise mechanical model of debris landslide deformation during rainfall was established; the static equilibrium condition of the landslide mass system was derived from this model, and the typical deformation and failure process and failure modes of debris landslides during rainfall were discussed.
In this thesis, the classical limiting-equilibrium slice method was modified by introducing the shear strength theory of unsaturated soil (1978), and a stability analysis program for debris landslides that accounts for saturated-unsaturated seepage was developed. The program has a reasonable data storage scheme and a simple interface, is easy to operate, and, integrated with Microsoft Office Excel, can be used for sensitivity analysis of the factors influencing landslide stability. Because drainage engineering design is usually based on empirical methods and lacks effective quantitative analysis and appraisal, the concept of the critical water table of a debris landslide is put forward. For debris landslides with different kinds of slip surface encountered in engineering practice, a program was developed to search for the critical water table starting from the native groundwater table; since drainage works must draw the groundwater table down below the critical water table, the program can be used directly to guide drainage design. Taking the slope deformation body behind the former factory building of the Muli Shawan hydroelectric power station as an example, the stability of a debris landslide during rainfall was studied systematically and in detail: the relationships among rainfall amount, water table, and slope stability were established, and the landslide's stability during rainfall was analyzed from a dynamic viewpoint.
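For context, the 1978 unsaturated shear strength theory cited above (the author's name is missing in the source) is conventionally expressed as an extended Mohr-Coulomb criterion with an independent matric suction term. Assuming that convention, the strength law and a slice-method factor of safety take the following generic form; this is a sketch of the standard formulation, not necessarily the thesis program's exact one:

```latex
% Extended Mohr-Coulomb shear strength of unsaturated soil:
%   c'        effective cohesion
%   \sigma_n  total normal stress on the slip surface
%   u_a, u_w  pore-air and pore-water pressures
%   \varphi'  effective friction angle
%   \varphi^b friction angle with respect to matric suction (u_a - u_w)
\tau_f = c' + (\sigma_n - u_a)\tan\varphi' + (u_a - u_w)\tan\varphi^b

% Corresponding limit-equilibrium factor of safety summed over slices i
% (W_i slice weight, \alpha_i base inclination, l_i base length,
%  N_i normal force on the slice base):
F_s = \frac{\sum_i \bigl[ c'_i l_i + (N_i - u_{a,i} l_i)\tan\varphi'_i
      + (u_{a,i} - u_{w,i})\, l_i \tan\varphi^b_i \bigr]}
      {\sum_i W_i \sin\alpha_i}
```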
Abstract:
A Persistent Node is a redundant distributed mechanism for storing a key/value pair reliably in a geographically local network. In this paper, I develop a method of establishing Persistent Nodes in an amorphous matrix. I address issues of construction, usage, atomicity guarantees and reliability in the face of stopping failures. Applications include routing, congestion control, and data storage in gigascale networks.
Abstract:
For a very large network deployed in space with only nearby nodes able to talk to each other, we want to do tasks like robust routing and data storage. One way to organize the network is via a hierarchy, but hierarchies often have a few critical nodes whose death can disrupt organization over long distances. I address this with a system of distributed aggregates called Persistent Nodes, such that spatially local failures disrupt the hierarchy in an area proportional to the diameter of the failure. I describe and analyze this system, which has been implemented in simulation.
Abstract:
We present an algorithm to store data robustly in a large, geographically distributed network by means of localized regions of data storage that move in response to changing conditions. For example, data might migrate away from failures or toward regions of high demand. The PersistentNode algorithm provides this service robustly, but with limited safety guarantees. We use the RAMBO framework to transform PersistentNode into RamboNode, an algorithm that guarantees atomic consistency in exchange for increased cost and decreased liveness. In addition, a half-life analysis of RamboNode shows that it is robust against continuous low-rate failures. Finally, we provide experimental simulations for the algorithm on 2000 nodes, demonstrating how it services requests and examining how it responds to failures.
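The localized, redundant storage described in these PersistentNode abstracts can be illustrated with a toy simulation: replicas of a key/value pair live on all nodes within a fixed radius of an anchor point, and after failures the surviving replicas re-populate the neighborhood. This is a minimal sketch of the general idea only; the grid model, radius, and repair rule are assumptions, not the PersistentNode or RamboNode protocol itself:

```python
import random

RADIUS = 2  # replication radius around the anchor point (assumed)

class GridNode:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.alive = True
        self.store = {}  # key -> value replicas held by this node

def neighbors(nodes, cx, cy, r):
    """All live nodes within Chebyshev distance r of (cx, cy)."""
    return [n for n in nodes
            if n.alive and max(abs(n.x - cx), abs(n.y - cy)) <= r]

def put(nodes, key, value, cx, cy):
    """Replicate key/value on every live node near the anchor point."""
    for n in neighbors(nodes, cx, cy, RADIUS):
        n.store[key] = value

def repair(nodes, key, cx, cy):
    """After failures, re-replicate from any surviving copy near the anchor."""
    survivors = [n for n in neighbors(nodes, cx, cy, RADIUS) if key in n.store]
    if not survivors:
        return False  # data lost: the whole neighborhood died
    put(nodes, key, survivors[0].store[key], cx, cy)
    return True

# Build a 20x20 grid, store a value, kill 30% of the nodes, then repair.
grid = [GridNode(x, y) for x in range(20) for y in range(20)]
put(grid, "sensor-42", "reading", 10, 10)
for n in random.sample(grid, 120):
    n.alive = False
print("recovered:", repair(grid, "sensor-42", 10, 10))
```

The point the abstracts make is visible here: only a failure that wipes out the entire neighborhood around the anchor destroys the data, so damage is proportional to the diameter of the failed region.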
Abstract:
Dissertation presented to Universidade Fernando Pessoa in partial fulfillment of the requirements for the degree of Master in Computer Engineering, branch of Information Systems and Multimedia
Abstract:
In this project we design and implement a centralized hash table in the snBench sensor network environment. We discuss the feasibility of this approach and compare and contrast it with a distributed hashing architecture, with particular attention to the conditions under which a centralized architecture makes sense. Numerous computational tasks require persistence of data in a sensor network environment. To motivate the need for data storage in snBench, we demonstrate a practical application of the technology in which a video camera monitors a room to detect the presence of a person and sends an alert to the appropriate authorities.
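As a rough illustration of the centralized alternative to a DHT, the shared table can be one network service that every sensor node calls. The sketch below is hypothetical throughout (port, wire format, and operations are invented for illustration and are not snBench's API):

```python
import json
import socketserver
import threading

_table = {}             # the single, centralized key/value table
_lock = threading.Lock()

class TableHandler(socketserver.StreamRequestHandler):
    """One JSON request per line: {"op": "put"|"get", "key": ..., "value": ...}"""
    def handle(self):
        for line in self.rfile:
            req = json.loads(line)
            with _lock:
                if req["op"] == "put":
                    _table[req["key"]] = req["value"]
                    resp = {"ok": True}
                else:  # "get"
                    resp = {"ok": True, "value": _table.get(req["key"])}
            self.wfile.write((json.dumps(resp) + "\n").encode())

if __name__ == "__main__":
    # All sensor nodes talk to this one process; its address is the only piece
    # of global configuration. That simplicity, and the single point of
    # failure, is exactly the trade-off against a distributed hash table.
    with socketserver.ThreadingTCPServer(("0.0.0.0", 9000), TableHandler) as srv:
        srv.serve_forever()
```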
Abstract:
Error correcting codes are combinatorial objects designed to enable reliable transmission of digital data over noisy channels; they are used ubiquitously in communication, data storage, and elsewhere. Error correction allows reconstruction of the original data from the received word. Classical decoding algorithms are constrained to output just one codeword, but in the late 1950s researchers proposed a relaxed error correction model for potentially large error rates, known as list decoding. The research presented in this thesis focuses on reducing the computational effort and enhancing the efficiency of decoding algorithms for several codes, from both algorithmic and architectural standpoints. The codes in consideration are linear block codes closely related to Reed-Solomon (RS) codes. A high-speed, low-complexity algorithm and architecture are presented for encoding and decoding RS codes based on evaluation. Implementation results show that the hardware resources and the total execution time are significantly reduced compared to the classical decoder. The evaluation-based encoding and decoding schemes are modified and extended for shortened RS codes, and a software implementation shows a substantial reduction in memory footprint at the expense of latency. Hermitian codes can be seen as concatenated RS codes and are much longer than RS codes over the same alphabet. A fast, novel, and efficient VLSI architecture for Hermitian codes is proposed based on interpolation decoding; the proposed architecture is shown to outperform Kötter's decoder for high-rate codes. The thesis also explores a method of constructing optimal codes by computing the subfield subcodes of Generalized Toric (GT) codes, a natural extension of RS codes to several dimensions. The polynomial generators, or evaluation polynomials, of the subfield subcodes of GT codes are identified, from which the dimension and a bound on the minimum distance are computed. The algebraic structure of the polynomials evaluating to the subfield is used to simplify the list decoding algorithm for BCH codes. Finally, an efficient and novel approach is proposed for exploiting powerful codes that have complex decoding but a simple encoding scheme (comparable to RS codes) in multihop wireless sensor network (WSN) applications.
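Evaluation-based RS encoding, the starting point of the scheme summarized above, treats the k message symbols as coefficients of a polynomial and takes the codeword to be its values at n distinct field points. A minimal sketch over the prime field GF(929), chosen purely for readability (real systems use binary extension fields GF(2^m), and this toy decoder handles erasures only, not errors):

```python
P = 929  # prime field GF(929), illustrative only

def rs_encode(msg, n):
    """Evaluation-style RS encoding: message symbols are the coefficients of
    a degree < k polynomial; the codeword is its value at n distinct points."""
    assert len(msg) <= n <= P
    def poly_eval(coeffs, x):
        y = 0
        for c in reversed(coeffs):   # Horner's rule
            y = (y * x + c) % P
        return y
    return [poly_eval(msg, alpha) for alpha in range(n)]

def rs_decode_erasures(received, k):
    """Toy erasure decoder: Lagrange interpolation through any k surviving
    (position, value) pairs recovers the message polynomial exactly."""
    pts = [(x, y) for x, y in enumerate(received) if y is not None][:k]
    assert len(pts) == k, "need at least k surviving symbols"
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(pts):
        num, denom = [1], 1          # basis polynomial L_i, built up below
        for j, (xj, _) in enumerate(pts):
            if j != i:
                # multiply num by (x - xj); coefficients are low-degree first
                num = [(b - xj * a) % P
                       for a, b in zip(num + [0], [0] + num)]
                denom = denom * (xi - xj) % P
        inv = pow(denom, P - 2, P)   # modular inverse via Fermat
        for d in range(k):
            coeffs[d] = (coeffs[d] + yi * inv * num[d]) % P
    return coeffs

msg = [3, 1, 4, 1, 5]                      # k = 5 message symbols
code = rs_encode(msg, 9)                   # n = 9 codeword symbols
code[0] = code[6] = None                   # erase two symbols
print(rs_decode_erasures(code, 5) == msg)  # True
```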
Abstract:
Electron microscopy (EM) has advanced exponentially since the first transmission electron microscope (TEM) was built in the 1930s. The urge to 'see' things is an essential part of human nature ('seeing is believing'), and apart from scanning tunneling microscopes, which give information only about the surface, EM is the only imaging technology capable of truly visualising atomic structures in depth, down to single atoms. With the development of nanotechnology the demand to image and analyse small things has grown even greater, and electron microscopes have evolved from highly delicate, sophisticated research-grade instruments to turnkey and even bench-top instruments for everyday use in every materials research lab on the planet. The semiconductor industry is as dependent on EM as the life sciences and the pharmaceutical industry. With this generalisation of use for imaging, the need to deploy advanced uses of EM has become more and more apparent. The combination of several coinciding beams (electron, ion, and even light) to create DualBeam or TripleBeam instruments, for instance, extends the usefulness from pure imaging to manipulation at the nanoscale. And given the many ways in which highly energetic electrons and ions interact with the matter in the specimen, a plethora of analytical niches has evolved over the last two decades, specialising in every kind of analysis that can be combined with EM. In this study the emphasis was placed on applying these advanced analytical EM techniques in the context of multiscale and multimodal microscopy: multiscale meaning across length scales from micrometres or larger down to nanometres, multimodal meaning numerous techniques applied to the same sample volume in a correlative manner. To demonstrate the breadth and potential of the multiscale and multimodal concept, it was applied in two areas: I) biocompatible materials, using polycrystalline stainless steel, and II) semiconductors, using thin multiferroic films. I) The motivation to use stainless steel (316L medical grade) comes from the potential to modulate endothelial cell growth, which could substantially improve cardiovascular stents (mainly made of 316L) through nano-texturing of the stent surface by focused ion beam (FIB) lithography. FIB patterning has never before been reported in connection with stents and cell growth, so, to better understand the beam-substrate interaction during patterning, a correlative microscopy approach was used to illuminate the patterning process from many angles. Electron backscatter diffraction (EBSD) was used to analyse the crystallographic structure; FIB was used for patterning while simultaneously visualising the crystal structure as part of process monitoring; scanning electron microscopy (SEM) and atomic force microscopy (AFM) were employed to analyse the topography; and the final step was 3D visualisation through serial FIB/SEM sectioning. II) The motivation for using thin multiferroic films stems from the ever-growing demand for increased data storage at ever-lower energy consumption. The Aurivillius phase material used in this study has high potential in this area, yet it must be shown clearly that the film is genuinely multiferroic and that no second-phase inclusions are present even at very low concentrations; as little as ~0.1 vol% could already be problematic.
Thus, in this study a technique was developed to detect ultra-low-density inclusions in thin multiferroic films down to concentrations of 0.01%. The outcome is a complete structural and compositional analysis of the films, which required identifying second-phase inclusions (through EDX (energy dispersive X-ray) elemental analysis), localising them (72-hour EDX mapping in the SEM), isolating them for the TEM (using FIB), and assigning an upper confidence limit of 99.5% to the influence of the inclusions on the magnetic behaviour of the main phase (statistical analysis).
Abstract:
A new algorithm is presented for training nonlinear optimal neuro-controllers (in the form of the model-free, action-dependent, adaptive critic paradigm). It overcomes problems with existing stochastic backpropagation training, namely the need for data storage, parameter shadowing, and poor convergence, offering significant benefits for online applications.
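As background, the action-dependent adaptive critic paradigm can be sketched as a fully online loop: a critic estimates the cost-to-go Q(x, u) and is updated from a temporal-difference error, while the actor descends the critic's gradient with respect to the action. The toy below illustrates that generic paradigm on a made-up scalar plant; it is not the paper's algorithm, and every constant in it is an assumption. Note that nothing is stored between steps, which is the "no data storage" property the abstract highlights:

```python
import numpy as np

rng = np.random.default_rng(0)
gamma, lr_c, lr_a = 0.95, 0.05, 0.01  # discount, critic and actor step sizes
w = np.zeros(3)                        # critic weights: Q = w . [x^2, xu, u^2]
k = 0.0                                # actor gain: u = k x

def phi(x, u):
    return np.array([x * x, x * u, u * u])

for episode in range(300):
    x = rng.uniform(-2.0, 2.0)         # fresh starts keep excitation
    for step in range(30):
        u = k * x + 0.05 * rng.standard_normal()  # exploring action
        cost = x * x + u * u                      # stage cost to minimize
        x_next = 0.9 * x + u                      # made-up scalar plant
        u_next = k * x_next
        # temporal-difference error on the cost-to-go estimate
        delta = cost + gamma * (w @ phi(x_next, u_next)) - w @ phi(x, u)
        w += lr_c * delta * phi(x, u)             # online critic update
        # actor descends dQ/du through the critic: dQ/du = w1*x + 2*w2*u
        k -= lr_a * (w[1] * x + 2 * w[2] * u) * x
        x = x_next

print("learned feedback gain:", round(k, 3))  # should come out negative
```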
Abstract:
A stencilling technique for depositing arrays of nanoscale ferroelectric capacitors on a surface could be useful in data storage devices.
Abstract:
Topographic and optical contrasts formed by Ga+ ion irradiation of thin films of amorphous silicon carbide have been investigated with scanning near-field optical microscopy. The influence of ion irradiation dose was studied in a pattern of sub-micrometre stripes. While the film thickness decreases monotonically with ion dose, the optical contrast rapidly increases to a maximum value and then decreases gradually. The results are discussed in terms of competition between the effects of ion implantation and surface milling by the ion beam. The observed effects are important for the use of amorphous silicon carbide thin films as permanent archives in optical data storage applications.