892 results for One-way Quantum Computer
Abstract:
At the 70th SEG Annual Meeting, many authors announced results on wave-equation pre-stack depth migration. Wave-field imaging methods based on the wave equation have matured and become the main direction of seismic imaging. Imaging complex media has been a central goal of the national "85" and "95" reservoir geophysics key projects and of the "Knowledge Innovation Key Project of the Chinese Academy of Sciences". Furthermore, we began studying the particular oil-field conditions of our country together with international research groups. Against this background, the author combined symplectic ideas with wave-equation pre-stack depth migration and developed an efficient wave-equation pre-stack depth migration method. The purpose of this work is to find a way to image the complex geological targets of Chinese oilfields and to form a seismic data processing procedure. The paper gives an approximation of the one-way wave-equation operator and shows numerical results. Comparisons are made among the split-step phase method, Kirchhoff, and Ray+FD methods on impulse responses, a simple model, and the Marmousi model. The results show that the method presented here has higher accuracy. Four field data examples are also given; their results demonstrate that the method is usable. Velocity estimation is an important part of wave-equation pre-stack depth migration. A parallel velocity estimation program has been written and tested on Beowulf clusters; it can build a velocity profile automatically. An example on the Marmousi model is shown in the third part of the paper to demonstrate the method, and another field data example is also given. The Beowulf cluster represents a convergence of high-performance computer architectures, and today it is a good choice for institutes and small companies to complete such tasks.
The paper gives comparison results for the computation of wave-equation pre-stack migration on a Beowulf cluster, the IBM SP2 (24 nodes) in Daqing, and the Shuguang 3000, along with a comparison of their prices. The results show that the Beowulf cluster is an efficient way to handle the large computational load of wave-equation pre-stack depth migration, especially in 3D.
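The split-step phase method used as a benchmark above extrapolates each frequency slice of the wavefield one depth step at a time: a phase shift in the wavenumber domain with a reference velocity, followed by a space-domain correction for lateral velocity variation. The following is a minimal one-dimensional sketch of such a depth step; the function name, discretisation, and the zeroing of evanescent energy are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def split_step_extrapolate(wavefield, dx, dz, omega, v_ref, v):
    """One split-step phase-shift depth step for a single frequency omega.

    wavefield : complex samples along x at depth z
    v_ref     : laterally constant reference velocity
    v         : actual velocity profile along x at this depth
    """
    nx = wavefield.size
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    # Phase shift in the wavenumber domain using the reference velocity.
    kz2 = (omega / v_ref) ** 2 - kx ** 2
    kz = np.sqrt(np.maximum(kz2, 0.0))        # evanescent energy is zeroed
    shifted = np.fft.ifft(np.fft.fft(wavefield) * np.exp(1j * kz * dz))
    # Split-step correction in the space domain for the velocity perturbation.
    correction = np.exp(1j * omega * (1.0 / v - 1.0 / v_ref) * dz)
    return shifted * correction
```

For a laterally homogeneous medium the correction term is unity and the step reduces to pure phase-shift migration, which preserves the energy of the propagating wavefield.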
Abstract:
Establishing correspondences among object instances is still challenging in multi-camera surveillance systems, especially when the cameras’ fields of view are non-overlapping. Spatiotemporal constraints can help in solving the correspondence problem but still leave a wide margin of uncertainty. One way to reduce this uncertainty is to use appearance information about the moving objects in the site. In this paper we present the preliminary results of a new method that can capture salient appearance characteristics at each camera node in the network. A Latent Dirichlet Allocation (LDA) model is created and maintained at each node in the camera network. Each object is encoded in terms of the LDA bag-of-words model for appearance. The encoded appearance is then used to establish probable matching across cameras. Preliminary experiments are conducted on a dataset of 20 individuals and comparison against Madden’s I-MCHR is reported.
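Once each object is encoded as an appearance vector (for instance an LDA topic mixture), the cross-camera matching step reduces to nearest-neighbour search. A minimal sketch of that step only; cosine similarity and the function name are assumptions, and the LDA inference itself is omitted:

```python
import numpy as np

def match_across_cameras(queries, gallery):
    """For each query appearance vector, return the index of the most
    similar gallery vector (appearance vectors from another camera node),
    ranked by cosine similarity."""
    def cosine(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return [max(range(len(gallery)), key=lambda j: cosine(q, gallery[j]))
            for q in queries]
```

In practice the similarity scores would be kept (not just the argmax) so that spatiotemporal constraints can reweight the probable matches.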
Abstract:
Constraint programming has emerged as a successful paradigm for modelling combinatorial problems arising from practical situations. In many of those situations, we are not provided with an immutable set of constraints. Instead, a user will modify his requirements, in an interactive fashion, until he is satisfied with a solution. Examples of such applications include, amongst others, model-based diagnosis, expert systems, and product configurators. The system he interacts with must be able to assist him by showing the consequences of his requirements. Explanations are the ideal tool for providing this assistance. However, existing notions of explanations fail to provide sufficient information. We define new forms of explanations that aim to be more informative. Even though explanation generation is a very hard task, in the applications we consider we must provide a satisfactory level of interactivity and therefore cannot afford long computational times. We introduce the concept of representative sets of relaxations: compact sets of relaxations that show the user at least one way to satisfy each of his requirements and at least one way to relax it, and we present an algorithm that efficiently computes such sets. We introduce the concept of most soluble relaxations, which maximise the number of products they allow, and present algorithms to compute such relaxations in times compatible with interactivity, achieved by making use of different types of compiled representations interchangeably. We propose to generalise the concept of prime implicates to constraint problems through the concept of domain consequences, and suggest generating them as a compilation strategy. This sets out a new approach to compilation and allows explanation-related queries to be addressed efficiently. We define ordered automata to compactly represent large sets of domain consequences, in a way orthogonal to existing compilation techniques that represent large sets of solutions.
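A single relaxation of an over-constrained requirement set can be grown greedily: keep a constraint whenever adding it leaves the set satisfiable. The sketch below illustrates only this basic idea under an assumed interval-intersection example; the thesis's representative-set and most-soluble-relaxation algorithms are considerably more involved.

```python
def maximal_relaxation(constraints, is_consistent):
    """Greedily grow a maximal consistent subset (a relaxation) of the
    user's constraints; is_consistent(subset) decides satisfiability."""
    kept = []
    for c in constraints:
        if is_consistent(kept + [c]):
            kept.append(c)
    return kept

# Illustrative consistency check: constraints are closed intervals that
# must all share a common point.
def intervals_consistent(ivs):
    return (not ivs) or max(lo for lo, hi in ivs) <= min(hi for lo, hi in ivs)
```

Note that the result depends on the order in which constraints are considered; a representative set of relaxations, as defined in the abstract, covers every requirement rather than committing to one such order.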
Abstract:
An aim of proactive risk management strategies is the timely identification of safety-related risks. One way to achieve this is by deploying early warning systems, which aim to provide useful information on the presence of potential threats to a system, the level of vulnerability of a system, or both, in a timely manner. This information can then be used to take proactive safety measures. The United Nations has recommended that any early warning system have four essential elements: risk knowledge, a monitoring and warning service, dissemination and communication, and a response capability. This research deals with the risk knowledge element of an early warning system, which contains models of possible accident scenarios. These accident scenarios are created using hazard analysis techniques, which can be categorised as traditional or contemporary. Traditional hazard analysis techniques assume that accidents occur due to a sequence of events, whereas contemporary techniques assume that safety is an emergent property of complex systems. The problem is that no software editor is available that lets analysts create models of accident scenarios based on contemporary hazard analysis techniques and, at the same time, generate computer code representing those models. This research aims to enhance the process of generating computer code from graphical models that associate early warning signs and causal factors with a hazard, based on contemporary hazard analysis techniques. For this purpose, the thesis investigates the use of Domain Specific Modeling (DSM) technologies.
The contribution of this thesis is the design and development of a set of three graphical Domain Specific Modeling Languages (DSMLs) that, when combined, provide all of the constructs necessary for safety experts and practitioners to conduct hazard and early warning analysis based on a contemporary hazard analysis approach. The languages represent the elements and relations necessary to define accident scenarios and their associated early warning signs. The three DSMLs were incorporated into a prototype software editor that enables safety scientists and practitioners to create and edit hazard and early warning analysis models in a usable manner and, as a result, to generate executable code automatically. This research shows that DSM technologies can be used to develop a set of three DSMLs that allow users to conduct hazard and early warning analysis in a more usable manner. Furthermore, the three DSMLs and their dedicated editor, which are presented in this thesis, may provide a significant enhancement to the process of creating the risk knowledge element of computer-based early warning systems.
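The model-to-code step can be sketched in miniature: a toy "model" mapping early-warning signs to thresholds is turned into an executable monitoring check. This is purely illustrative of DSM-style code generation, and far simpler than the thesis's three DSMLs; every name below is an assumption.

```python
def generate_monitor(model):
    """Generate an executable early-warning check from a toy hazard
    model given as {warning_sign: threshold}. Sketches the idea of
    emitting code from a graphical model's abstract syntax."""
    lines = ["def check(readings):", "    alerts = []"]
    for sign, threshold in model.items():
        lines.append(f"    if readings.get('{sign}', 0) > {threshold}:")
        lines.append(f"        alerts.append('{sign}')")
    lines.append("    return alerts")
    namespace = {}
    exec("\n".join(lines), namespace)   # compile the generated source
    return namespace["check"]
```

A real generator would of course emit source files for a target platform rather than exec'ing strings, but the pipeline is the same: model elements in, executable artefact out.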
Abstract:
We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user. © 2010 The American Physical Society.
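For small systems the trace distance between two density matrices is straightforward to evaluate numerically from the eigenvalues of their Hermitian difference. A minimal sketch of that quantity (the paper's contribution is its analytical treatment, not this routine):

```python
import numpy as np

def trace_distance(rho, sigma):
    """Trace distance D(rho, sigma) = (1/2) Tr|rho - sigma|, computed
    as half the sum of absolute eigenvalues of the Hermitian difference."""
    eigs = np.linalg.eigvalsh(np.asarray(rho) - np.asarray(sigma))
    return 0.5 * float(np.sum(np.abs(eigs)))
```

Orthogonal pure states are at distance 1, identical states at distance 0, and a pure qubit state sits at distance 1/2 from the maximally mixed state.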
Abstract:
Aerodynamic generation of sound is governed by the Navier–Stokes equations while acoustic propagation in a non-uniform medium is effectively described by the linearised Euler equations. Different numerical schemes are required for the efficient solution of these two sets of equations, and therefore, coupling techniques become an essential issue. Two types of one-way coupling between the flow solver and the acoustic solver are discussed: (a) for aerodynamic sound generated at solid surfaces, and (b) in the free stream. Test results indicate how the coupling achieves the necessary accuracy so that Computational Fluid Dynamics codes can be used in aeroacoustic simulations.
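A one-way coupling of this kind can be sketched as a time loop in which the flow state forces the acoustic solver but receives nothing back. All names below are illustrative placeholders for the actual CFD and linearised-Euler solvers:

```python
def one_way_coupling(steps, fluid_step, acoustic_step, u0, p0):
    """One-way coupled time loop: the flow solution drives the acoustic
    solver, but the acoustic field is never fed back into the flow.

    fluid_step(u) -> u'        advances the flow state alone
    acoustic_step(p, u) -> p'  advances the acoustic field forced by u
    """
    u, p = u0, p0
    for _ in range(steps):
        u = fluid_step(u)        # flow solver advances independently
        p = acoustic_step(p, u)  # acoustic solver sees the new flow state
    return u, p
```

The coupling interface (interpolation of source terms from the CFD mesh onto the acoustic mesh) is exactly where the accuracy issues discussed in the abstract arise.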
Abstract:
Computational modelling of dynamic fluid-structure interaction (DFSI) is problematic since, conventionally, computational fluid dynamics (CFD) is solved using finite volume (FV) methods while computational structural mechanics (CSM) is based entirely on finite element (FE) methods. Hence, progress in modelling the emerging multi-physics problem of dynamic fluid-structure interaction in a consistent manner is frustrated, and significant problems in computational convergence may be encountered in transferring and filtering data from one mesh and solution procedure to another, unless the fluid-structure coupling is either one way, very weak, or both. This paper sets out the solution procedure for modelling the multi-physics dynamic fluid-structure interaction problem within a single software framework, PHYSICA, using finite volume, unstructured mesh (FV-UM) procedures, and focuses on some of the problems and issues that have to be resolved for time-accurate, closely coupled dynamic fluid-structure flutter analysis.
Abstract:
Fluid-structure interaction, as applied to flexible structures, has wide application in diverse areas such as flutter in aircraft, flow in elastic pipes and blood vessels, and extrusion of metals through dies. However, a comprehensive computational model of these multi-physics phenomena is a considerable challenge. Until recently, work in this area focused on one phenomenon and represented the behaviour of the other more simply, even to the extent, in metal forming for example, that the deformation of the die was totally ignored. More recently, strategies for solving the full coupling between the fluid and solid mechanics behaviour have been developed. Conventionally, the computational modelling of fluid-structure interaction is problematic since computational fluid dynamics (CFD) is solved using finite volume (FV) methods while computational structural mechanics (CSM) is based entirely on finite element (FE) methods. In the past, the concurrent but rather disparate development paths of the finite element and finite volume methods have resulted in numerical software tools for CFD and CSM that are different in almost every respect. Hence, progress in modelling the emerging multi-physics problem of fluid-structure interaction in a consistent manner is frustrated. Unless the fluid-structure coupling is either one way, very weak, or both, transferring and filtering data from one mesh and solution procedure to another may lead to significant problems in computational convergence. Using a novel three-phase technique, the full interaction between the fluid and the dynamic structural response is represented. The procedure is demonstrated on some challenging applications in complex three-dimensional geometries involving aircraft flutter, metal forming, and blood flow in arteries.
Abstract:
Fractal video compression is a relatively new video compression method. Its attraction is due to its high compression ratio and simple decompression algorithm, but its computational complexity is high and, as a result, parallel algorithms on high-performance machines become one way out. In this study we partition the matching search, which occupies the majority of the work in a fractal video compression process, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm is able to achieve a high speedup in these distributed environments.
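Partitioning the matching search into one independent task per range block might look as follows. A thread pool stands in for the paper's DCOM / .NET Remoting distribution over a LAN of PCs, and the flat-list block representation and mean-squared-error metric are illustrative assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def best_match(range_block, domain_blocks):
    """Index of the domain block minimising mean-squared error to the
    given range block (blocks as flat lists of pixel values)."""
    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return min(range(len(domain_blocks)),
               key=lambda i: mse(range_block, domain_blocks[i]))

def parallel_matching(range_blocks, domain_blocks, workers=4):
    """Partition the matching search: each range block is an independent
    task, so the tasks can be farmed out to workers with no coordination."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda rb: best_match(rb, domain_blocks),
                             range_blocks))
```

Because the tasks share no state, the same decomposition maps directly onto distributed workers, which is what makes the loosely coupled PC cluster effective here.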
Abstract:
Common Learning Management Systems (for example Moodle [1] and Blackboard [2]) are limited in the amount of personalisation that they can offer the learner. They are used widely and do offer a number of tools for instructors to enable them to create and manage courses; however, they do not allow the learner to have a unique personalised learning experience. The e-Learning platform iLearn offers personalisation for the learner in a number of ways, one of which is to offer specific learning material to the learner based on the learner's learning style. Learning styles and how we learn is a vast research area. Brusilovsky and Millan [3] state that learning styles are typically defined as the way people prefer to learn. Examples of commonly used learning styles are the Kolb Learning Styles Theory [4], the Felder and Silverman Index of Learning Styles [5], VARK [6], and the Honey and Mumford Index of Learning Styles [7], and many research projects (SMILE [8], INSPIRE [9], and iWeaver [10], amongst others) attempt to incorporate these learning styles into adaptive e-Learning systems. This paper describes how learning styles are currently being used within the area of adaptive e-Learning. The paper then gives an overview of the iLearn project and of how iLearn uses the VARK learning style to enhance the platform's personalisation and adaptability for the learner. This research also describes the system's design and how the learning style is incorporated into the system design and semantic framework within the learner's profile.
Abstract:
We perform an extensive study of the properties of global quantum correlations in finite-size one-dimensional quantum spin models at finite temperature. By adopting a recently proposed measure for global quantum correlations (Rulli and Sarandy 2011 Phys. Rev. A 84 042109), called global discord, we show that critical points can be neatly detected even for many-body systems that are not in their ground state. We consider the transverse Ising model, the cluster-Ising model where three-body couplings compete with an Ising-like interaction, and the nearest-neighbor XX Hamiltonian in transverse magnetic field. These models embody our canonical examples showing the sensitivity of global quantum discord close to criticality. For the Ising model, we find a universal scaling of global discord with the critical exponents pertaining to the Ising universality class.
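For very small chains, the transverse Ising Hamiltonian studied here can be built densely with Kronecker products and diagonalised exactly. A sketch under assumed conventions (open boundaries, H = -J Σ σx_i σx_{i+1} - h Σ σz_i); this is background machinery, not the paper's global-discord computation:

```python
import numpy as np

def transverse_ising_H(n, J=1.0, h=1.0):
    """Dense Hamiltonian of the open-chain transverse-field Ising model,
    H = -J * sum_i sx_i sx_{i+1} - h * sum_i sz_i  (practical for small n)."""
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    I2 = np.eye(2)

    def op(site_ops):
        # Tensor product over all n sites, identity where unspecified.
        out = np.array([[1.0]])
        for k in range(n):
            out = np.kron(out, site_ops.get(k, I2))
        return out

    H = np.zeros((2 ** n, 2 ** n))
    for i in range(n - 1):
        H -= J * op({i: sx, i + 1: sx})
    for i in range(n):
        H -= h * op({i: sz})
    return H
```

Thermal states exp(-H/T)/Z built from this Hamiltonian are the finite-temperature inputs on which a measure like global discord would then be evaluated.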
Abstract:
Cloud computing technology has rapidly evolved over the last decade, offering an alternative way to store and work with large amounts of data. However, data security remains an important issue, particularly when using a public cloud service provider. The recent area of homomorphic cryptography allows computation on encrypted data, which would allow users to ensure data privacy on the cloud and increase the potential market for cloud computing. A significant amount of research on homomorphic cryptography has appeared in the literature over the last few years; yet the performance of existing implementations of encryption schemes remains unsuitable for real-time applications. One way this limitation is being addressed is through the use of graphics processing units (GPUs) and field programmable gate arrays (FPGAs) for implementations of homomorphic encryption schemes. This review presents the current state of the art in this promising new area of research and highlights the interesting remaining open problems.
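As a toy illustration of what "computation on encrypted data" means (not one of the modern schemes the review covers), textbook RSA is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. The parameters below are tiny and insecure, chosen purely for demonstration.

```python
# Textbook (unpadded) RSA with toy parameters -- insecure by design,
# used only to show the multiplicative homomorphic property.
def rsa_toy_keys():
    p, q = 61, 53
    n, e = p * q, 17                     # n = 3233
    d = pow(e, -1, (p - 1) * (q - 1))    # modular inverse (Python 3.8+)
    return (n, e), (n, d)

def encrypt(pk, m):
    n, e = pk
    return pow(m, e, n)

def decrypt(sk, c):
    n, d = sk
    return pow(c, d, n)
```

Multiplying Enc(a) and Enc(b) modulo n yields a valid encryption of a*b (mod n), without the holder of the ciphertexts ever seeing a or b. Fully homomorphic schemes extend this to both addition and multiplication, which is what makes arbitrary computation on encrypted cloud data possible in principle.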
Abstract:
The aim of this work was the optimisation of an industrial air-conditioning system consisting of four adiabatic air-handling units, which show limitations in cooling capacity, control, and efficiency. Initially, a bibliographic survey and the collection of information on the textile industry and the evaporative cooling process were required. At a later stage, the various data essential to understanding the building/HVAC-system pairing were collected and analysed in order to identify possible optimisation hypotheses. The information- and data-gathering phase also included indoor air quality (IAQ) analyses. The optimisations selected as feasible were studied and analysed with the aid of the dynamic energy simulation software DesignBuilder, and the results were processed and adjusted to allow a friendly, easily interpreted assessment of their advantages and disadvantages; they were also the subject of an economic feasibility study. The proposed optimisation reflects a substantial improvement in indoor temperature and relative humidity conditions while still yielding an energy consumption reduction of around 23% (490,337 kWh), that is, an annual saving of €42,169 in operating costs, with a payback period of 1 year and 11 months.
Abstract:
The effectiveness of various kinds of computer programs is of concern to nurse-educators. Using a 3x3 experimental design, ninety second-year diploma student nurses were randomly selected from the total population at three community colleges in Ontario. Data were collected via a 20-item, valid and reliable Likert-type questionnaire developed by the nursing profession to measure nurses' perceptions of computers in the nursing role. The groups were pretested and posttested at the beginning and end of one semester. Subjects in the College A group received a computer literacy course comprising word processing with technology awareness. College B students were exposed to computer-aided instruction, primarily nursing simulations, intermittently throughout the semester. College C subjects maintained their regular curriculum with no computer involvement. Student's t-test (two-tailed) was employed to assess the attitude score data, and a one-way analysis of variance was performed on the attitude scores. Posttest analysis revealed a significant difference (p<.05) in attitude scores on the use of computers in the nursing role between Colleges A and C. No significant differences (p>.05) were seen between Colleges B and A in posttesting. Suggestions for continued computer education of diploma student nurses are provided.
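The one-way analysis of variance applied to the attitude scores reduces to an F statistic: the between-group mean square divided by the within-group mean square. A minimal sketch of that computation (illustrative only, not the study's statistical software):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over k independent groups:
    MS_between / MS_within, with k-1 and n-k degrees of freedom."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F relative to the F distribution with (k-1, n-k) degrees of freedom is what licences conclusions such as the significant College A versus College C difference reported above.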