957 results for GLOBALLY HYPERBOLIC SPACETIMES


Relevance: 10.00%

Abstract:

The seismic survey is the most effective geophysical method in oil and gas exploration and development. As a principal means of processing and interpreting seismic data, impedance inversion occupies a special position in seismic surveying, because impedance is the link that connects seismic data with well-logging and geological information and is essential for predicting reservoir properties and sand bodies. In practice, the result of traditional impedance inversion is often unsatisfactory: the mathematical inverse problem is ill-posed, so the solution is unstable and non-unique, and regularization must be introduced. Most regularizations presented in the existing literature rest on the premise that the image (or model) is globally smooth. A real geological model, however, consists of smooth regions separated by distinct edges, and these edges are an important attribute of the model. It is difficult to preserve such features and to keep an edge from being smoothed beyond recognition. In this paper we therefore propose an impedance inversion method controlled by hyperparameters with edge-preserving regularization, which improves both the convergence speed and the quality of the result. To preserve edges, the potential function of the regularization should satisfy nine conditions, including basic assumptions, edge preservation, and convergence assumptions. The outcome is a model with a clean background and clear edge anomalies. Several potential functions and their corresponding weight functions are presented; the potential functions φHL and φGM meet the required inversion precision in model calculations. For locally constant, planar, and quadric models we present the corresponding neighborhood systems of the Markov random field used in the regularization term. We linearize the nonlinear regularization by means of half-quadratic regularization, which preserves edges, simplifies the inversion, and allows linear solution methods to be used. Two regularization parameters (hyperparameters), λ2 and δ, are introduced in the regularization term: λ2 balances the influence of the data term against the prior term, and δ is a calibration parameter that adjusts the gradient value at discontinuities (formation interfaces). The choice of initial hyperparameter values and the way they are updated during the inversion strongly affect the convergence speed and the quality of the result. We give rough initial values using a trend curve of φ-(λ2, δ) and a method for computing an upper bound on the hyperparameters, and we update the hyperparameters either by a fixed coefficient or by a Maximum Likelihood method, which can be carried out simultaneously with the inversion. The inversion itself uses a Fast Simulated Annealing (FSA) algorithm, which escapes local extrema without depending on the initial value and yields a globally optimal result. We also discuss in detail the convergence condition of FSA, the Metropolis acceptance probability from Metropolis-Hastings, the annealing process based on Gibbs sampling, and other methods integrated with FSA; this material helps in understanding and improving FSA.
Tests on a theoretical model and application to field data show that the impedance inversion method presented here achieves high precision, is practical, and produces clear improvements.
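To make the structure of such an inversion concrete, here is a minimal illustrative sketch (not the authors' code) of an edge-preserving objective built from the standard Hebert-Leahy and Geman-McClure potentials and the hyperparameters λ2 and δ discussed above; the forward operator G, the toy blocky model, and all function names are assumptions introduced only for this example.

```python
# Illustrative sketch only -- not the paper's implementation.
import numpy as np

def phi_hl(t):
    """Hebert-Leahy potential, log(1 + t^2): edge-preserving."""
    return np.log1p(t ** 2)

def phi_gm(t):
    """Geman-McClure potential, t^2 / (1 + t^2)."""
    return t ** 2 / (1.0 + t ** 2)

def half_quadratic_weight_gm(t):
    """Weight w(t) = phi'(t) / (2t) for the Geman-McClure potential.
    Small weights at large gradients keep formation interfaces sharp."""
    return 1.0 / (1.0 + t ** 2) ** 2

def edge_preserving_objective(m, d, G, lam2, delta, phi=phi_gm):
    """Data misfit plus edge-preserving prior:
    J(m) = ||d - G m||^2 + lam2 * sum phi(grad(m) / delta)."""
    residual = d - G @ m
    grad_m = np.diff(m)                 # vertical impedance gradient
    return residual @ residual + lam2 * np.sum(phi(grad_m / delta))

# Toy usage: a blocky impedance model observed through a smoothing operator.
rng = np.random.default_rng(0)
m_true = np.concatenate([np.full(50, 2500.0), np.full(50, 3200.0)])
G = np.eye(100) * 0.8 + np.eye(100, k=1) * 0.1 + np.eye(100, k=-1) * 0.1
d = G @ m_true + rng.normal(0, 5.0, 100)
print(edge_preserving_objective(m_true, d, G, lam2=10.0, delta=100.0))
```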

Relevance: 10.00%

Abstract:

The study of biogenic proxies in tropical and subtropical regions provides important evidence about the process and history of vegetation and environmental change, and is of global importance for understanding the dynamic mechanisms of paleoclimatic and paleoenvironmental change. The sediments of Huguangyan Maar Lake in Guangdong Province offer a continuous, high-resolution record of environmental and vegetational change over the past 55 ka. Studies of the chronology and of physical and chemical environmental proxies have already provided much important information about the paleoenvironmental and paleoclimatic history. Phytoliths, a relatively new biogenic proxy, are used here to determine the nature and types of plants in this area over the last 55 ka. This study presents preliminary results on the characteristics of phytolith shapes, the variations in the fossil assemblages, and their significance for environmental change. The author also examines the evolution of particular species and their relationship to climatic parameters, and reconstructs the fire history from variations in charcoal abundance. The main results and conclusions are as follows. Twenty-eight types of phytoliths were identified in 233 samples, and their environmental significance is investigated in detail. Based on the variations in phytolith assemblages, the history and process of climatic and environmental change over the last 55 ka BP have been established for the region. Climate passed through eight intervals during this period, varying from hot-humid to cool-dry on the ten-thousand-year scale, with shorter dry-hot episodes on the millennial scale. The history of palm plants in this region has been established, with two peaks, from 55-39 ka and since the Holocene. Bambusoideae plants grew in the area throughout the period, reflecting the influence of the East Asian summer monsoon; their abundance follows a tendency similar to that of the palms, but lags by 1-2 ka. Panicoideae plants, representative of C4 plants, show six flourishing periods, at about 54.5, 44, 41.5, 32.5, 14, and 10 ka BP, reflecting six short-term arid events. The charcoal record from Huguangyan Maar Lake reveals the history of natural fire, which occurred mostly during the dry periods of the last glacial, from 55-10 ka BP, centered at 50-45, 40-35, 30-25, and 20-15 ka BP, showing a cycle of roughly 10,000 years.

Relevance: 10.00%

Abstract:

We combine theories of optimal pump-dump control and the related transient probe absorption spectroscopy in order to elucidate the relation between these two optical processes and the possibility of experimental realization. In the weak response regime we identify the globally optimal pair of pump-dump control fields, and we further propose a second-order difference detection scheme to monitor the wave-packet dynamics that is jointly controlled by the pump and dump fields. The globally optimal solution also serves as the initial input for the iterative search for the optimal control fields in the strong response regime. We use a model I2 molecule to demonstrate numerically the pump-dump control and the detection of highly vibrationally excited wave-packet focusing dynamics on the ground X surface in both the weak and strong response regimes. The I2 B surface serves as the intermediate that assists the pump-dump control and the optical detection processes. Demonstrated in the strong response regime is an optimal pair of pump-dump molecular π pulses that inverts nearly the entire population onto the predefined target region within half a period of the vibrational motion. (C) 1999 American Institute of Physics. [S0021-9606(99)00115-4].

Relevance: 10.00%

Abstract:

Structure from motion often refers to the computation of 3D structure from a matched sequence of images. However, a depth map of a surface is difficult to compute and may not be a good representation for storage and recognition. Given matched images, I will first show that the sign of the normal curvature in a given direction at a given point in the image can be computed from a simple difference of slopes of line-segments in one image. Using this result, local surface patches can be classified as convex, concave, parabolic (cylindrical), hyperbolic (saddle point) or planar. At the same time the translational component of the optical flow is obtained, from which the focus of expansion can be computed.
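As a concrete illustration of the classification step, the following sketch shows how the signs of the normal curvature along the two principal directions determine the surface type; the function name and the use of principal directions are assumptions made here, whereas the paper obtains the sign in a given direction from a difference of slopes of line segments in one image.

```python
# Illustrative sketch only: classify a local surface patch from the signs
# (-1, 0, +1) of the normal curvature along the two principal directions.
def classify_patch(sign_1, sign_2):
    """Map principal-curvature signs to the patch types listed above."""
    if sign_1 == 0 and sign_2 == 0:
        return "planar"
    if sign_1 == 0 or sign_2 == 0:
        return "parabolic (cylindrical)"
    if sign_1 == sign_2:
        return "convex" if sign_1 > 0 else "concave"
    return "hyperbolic (saddle point)"

print(classify_patch(+1, +1))   # convex
print(classify_patch(+1, -1))   # hyperbolic (saddle point)
print(classify_patch(0, -1))    # parabolic (cylindrical)
```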

Relevance: 10.00%

Abstract:

The recognition of objects with smooth bounding surfaces from their contour images is considerably more complicated than that of objects with sharp edges, since in the former case the set of object points that generates the silhouette contours changes from one view to another. The "curvature method", developed by Basri and Ullman [1988], provides a way to approximate the appearance of such objects from different viewpoints. In this paper we analyze the curvature method. We apply the method to ellipsoidal objects and compute analytically the error obtained for different rotations of the objects. The error depends on the exact shape of the ellipsoid (namely, the relative lengths of its axes), and it increases as the ellipsoid becomes "deep" (elongated in the Z-direction). We show that the errors are usually small, and that, in general, a small number of models is required to predict the appearance of an ellipsoid from all possible views. Finally, we show experimentally that the curvature method also applies to objects with hyperbolic surface patches.

Relevance: 10.00%

Abstract:

MIT SchMUSE (pronounced "shmooz") is a concurrent, distributed, delegation-based object-oriented interactive environment with persistent storage. It is designed to run in a "capricious" network environment, where servers can migrate from site to site and can regularly become unavailable. Our design introduces a new form of unique identifiers called "globally unique tickets" that provide globally unique time/space stamps for objects and classes without being location specific. Object location is achieved by a distributed hierarchical lazy lookup mechanism that we call "realm resolution." We also introduce a novel mechanism called "message deferral" for enhanced reliability in the face of remote delegation. We conclude with a comparison to related work and a projection of future work on MIT SchMUSE.
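The following is a minimal sketch, not the SchMUSE implementation, of what a location-independent time/space stamp of this kind could look like: a ticket minted once from the local clock, the minting node's identity, and a per-node counter, which remains valid after the object migrates; all names are assumptions made for illustration.

```python
# Illustrative sketch only of a "globally unique ticket"-style identifier.
import itertools
import time
from dataclasses import dataclass

@dataclass(frozen=True)
class Ticket:
    minted_at: float      # time component of the stamp
    minting_node: int     # space component: node that created the object
    serial: int           # disambiguates objects minted in the same clock tick

class TicketMinter:
    def __init__(self, node_id: int):
        self.node_id = node_id
        self._serial = itertools.count()

    def mint(self) -> Ticket:
        # The ticket never changes, so it does not encode where the object
        # currently lives; locating it is a separate step (realm resolution).
        return Ticket(time.time(), self.node_id, next(self._serial))

minter = TicketMinter(node_id=7)
print(minter.mint())
```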

Relevance: 10.00%

Abstract:

This thesis investigates a new approach to lattice basis reduction suggested by M. Seysen. Seysen's algorithm attempts to globally reduce a lattice basis, whereas the Lenstra, Lenstra, Lovasz (LLL) family of reduction algorithms concentrates on local reductions. We show that Seysen's algorithm is well suited for reducing certain classes of lattice bases, and often requires much less time in practice than the LLL algorithm. We also demonstrate how Seysen's algorithm for basis reduction may be applied to subset sum problems. Seysen's technique, used in combination with the LLL algorithm and other heuristics, enables us to solve a much larger class of subset sum problems than was previously possible.
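For orientation, the sketch below shows the quantity Seysen-style reduction works with, the measure S(B) = Σ_i ||b_i||² ||b*_i||² over a basis and its dual, together with a naive greedy pass that accepts an elementary row operation only if it lowers S; this greedy search stands in for Seysen's closed-form update and is an assumption of this example, as are the function names.

```python
# Illustrative sketch only -- not the thesis code or Seysen's exact algorithm.
import numpy as np

def seysen_measure(B):
    """S(B) for a basis whose rows are the lattice vectors."""
    B_dual = np.linalg.inv(B).T          # rows are the dual basis vectors
    return float(np.sum(np.sum(B * B, axis=1) * np.sum(B_dual * B_dual, axis=1)))

def greedy_seysen_pass(B, lambdas=(-2, -1, 1, 2)):
    """Try b_j <- b_j + lam * b_i for each pair; keep moves that lower S."""
    B = B.astype(float).copy()
    n = B.shape[0]
    best = seysen_measure(B)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            for lam in lambdas:
                trial = B.copy()
                trial[j] += lam * trial[i]
                s = seysen_measure(trial)
                if s < best:
                    B, best = trial, s
    return B, best

B = np.array([[1.0, 0.0, 7.0], [0.0, 1.0, 11.0], [0.0, 0.0, 23.0]])
print(seysen_measure(B), greedy_seysen_pass(B)[1])
```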

Relevance: 10.00%

Abstract:

Parallel shared-memory machines with hundreds or thousands of processor-memory nodes have been built; in the future we will see machines with millions or even billions of nodes. Associated with such large systems is a new set of design challenges. Many problems must be addressed by an architecture in order for it to be successful; of these, we focus on three in particular. First, a scalable memory system is required. Second, the network messaging protocol must be fault-tolerant. Third, the overheads of thread creation, thread management and synchronization must be extremely low. This thesis presents the complete system design for Hamal, a shared-memory architecture which addresses these concerns and is directly scalable to one million nodes. Virtual memory and distributed objects are implemented in a manner that requires neither inter-node synchronization nor the storage of globally coherent translations at each node. We develop a lightweight fault-tolerant messaging protocol that guarantees message delivery and idempotence across a discarding network. A number of hardware mechanisms provide efficient support for massive multithreading and fine-grained synchronization. Experiments are conducted in simulation, using a trace-driven network simulator to investigate the messaging protocol and a cycle-accurate simulator to evaluate the Hamal architecture. We determine implementation parameters for the messaging protocol which optimize performance. A discarding network is easier to design and can be clocked at a higher rate, and we find that with this protocol its performance can approach that of a non-discarding network. Our simulations of Hamal demonstrate the effectiveness of its thread management and synchronization primitives. In particular, we find register-based synchronization to be an extremely efficient mechanism which can be used to implement a software barrier with a latency of only 523 cycles on a 512 node machine.
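As an illustration of the two guarantees the messaging layer must provide over a discarding network, delivery and idempotence, here is a minimal sketch that is not the Hamal protocol itself: the sender retransmits until acknowledged, and the receiver suppresses duplicates by sequence number; all class and field names are assumptions made here.

```python
# Illustrative sketch only of guaranteed delivery plus idempotence over a
# network that may silently discard packets.
import random

class Sender:
    def __init__(self):
        self.next_seq = 0
        self.unacked = {}                 # seq -> payload, resent until acked

    def send(self, payload):
        self.unacked[self.next_seq] = payload
        self.next_seq += 1

    def pending(self):
        return list(self.unacked.items())

    def on_ack(self, seq):
        self.unacked.pop(seq, None)

class Receiver:
    def __init__(self):
        self.seen = set()
        self.delivered = []

    def on_message(self, seq, payload):
        if seq not in self.seen:          # idempotence: apply each message once
            self.seen.add(seq)
            self.delivered.append(payload)
        return seq                        # ack (may itself be lost and resent)

# Lossy-network simulation: many packets are discarded, yet every message
# is eventually delivered exactly once.
sender, receiver, rng = Sender(), Receiver(), random.Random(0)
for p in ["a", "b", "c"]:
    sender.send(p)
while sender.unacked:
    for seq, payload in sender.pending():
        if rng.random() < 0.6:            # message survives the network
            ack = receiver.on_message(seq, payload)
            if rng.random() < 0.6:        # ack survives the network
                sender.on_ack(ack)
print(receiver.delivered)                 # all three payloads, each exactly once
```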

Relevance: 10.00%

Abstract:

Conventional parallel computer architectures do not provide support for non-uniformly distributed objects. In this thesis, I introduce sparsely faceted arrays (SFAs), a new low-level mechanism for naming regions of memory, or facets, on different processors in a distributed, shared memory parallel processing system. Sparsely faceted arrays address the disconnect between the global distributed arrays provided by conventional architectures (e.g. the Cray T3 series), and the requirements of high-level parallel programming methods that wish to use objects that are distributed over only a subset of processing elements. A sparsely faceted array names a virtual globally-distributed array, but actual facets are lazily allocated. By providing simple semantics and making efficient use of memory, SFAs enable efficient implementation of a variety of non-uniformly distributed data structures and related algorithms. I present example applications which use SFAs, and describe and evaluate simple hardware mechanisms for implementing SFAs. Keeping track of which nodes have allocated facets for a particular SFA is an important task that suggests the need for automatic memory management, including garbage collection. To address this need, I first argue that conventional tracing techniques such as mark/sweep and copying GC are inherently unscalable in parallel systems. I then present a parallel memory-management strategy, based on reference-counting, that is capable of garbage collecting sparsely faceted arrays. I also discuss opportunities for hardware support of this garbage collection strategy. I have implemented a high-level hardware/OS simulator featuring hardware support for sparsely faceted arrays and automatic garbage collection. I describe the simulator and outline a few of the numerous details associated with a "real" implementation of SFAs and SFA-aware garbage collection. Simulation results are used throughout this thesis in the evaluation of hardware support mechanisms.
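A programming-level sketch of the lazy-allocation idea, not the thesis's hardware mechanism, is given below: the array logically names a facet on every node, but a facet's storage is allocated only when that node first touches it, and the set of allocated nodes is exactly what the garbage collector has to track; the class and method names are assumptions.

```python
# Illustrative sketch only of lazy facet allocation in a sparsely faceted array.
class SparselyFacetedArray:
    def __init__(self, facet_size):
        self.facet_size = facet_size
        self._facets = {}                 # node_id -> storage, allocated lazily

    def facet_for(self, node_id):
        """Return this node's facet, allocating it on first use."""
        if node_id not in self._facets:
            self._facets[node_id] = [0] * self.facet_size
        return self._facets[node_id]

    def allocated_nodes(self):
        """The set a garbage collector must track: nodes holding facets."""
        return set(self._facets)

# Only nodes 3 and 17 touch the array, so only two facets consume memory,
# even though the array is logically distributed over every node.
sfa = SparselyFacetedArray(facet_size=4)
sfa.facet_for(3)[0] = 42
sfa.facet_for(17)[1] = 7
print(sfa.allocated_nodes())              # {3, 17}
```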

Relevance: 10.00%

Abstract:

This thesis describes some aspects of a computer system for doing medical diagnosis in the specialized field of kidney disease. Because such a system faces the spectre of combinatorial explosion, the discussion concentrates on heuristics that control the number of concurrent hypotheses and on efficient "compiled" representations of medical knowledge. In particular, the differential diagnosis of hematuria (blood in the urine) is discussed in detail. A protocol of a simulated doctor/patient interaction is presented and analyzed to determine the crucial structures and processes involved in the diagnostic procedure. The data structure proposed for representing medical information revolves around elementary hypotheses which are activated when certain findings are present; the process of disposing of findings, activating hypotheses, evaluating hypotheses locally, and combining hypotheses globally is examined for its heuristic implications. The thesis attempts to fit the problem of medical diagnosis into the framework of other Artificial Intelligence problems and paradigms, and in particular explores the notions of pure search vs. heuristic methods, linearity and interaction, and local vs. global knowledge, together with the structure of hypotheses within the world of kidney disease.
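The control structure described above, findings activating hypotheses that are scored locally and then combined globally, can be illustrated with the toy sketch below; the hypothesis table, scoring rule, and function names are invented here purely for illustration and are not taken from the thesis.

```python
# Toy sketch only: findings activate hypotheses, each hypothesis is scored
# locally, and the active set is ranked globally.
HYPOTHESES = {
    "glomerulonephritis": {"hematuria", "proteinuria", "hypertension"},
    "kidney stone":       {"hematuria", "flank pain"},
    "urinary infection":  {"hematuria", "dysuria", "fever"},
}

def activate(findings):
    """Keep the concurrent set small: activate only hypotheses that
    explain at least one observed finding."""
    return {h: f for h, f in HYPOTHESES.items() if f & findings}

def local_score(expected, findings):
    """Fraction of a hypothesis's expected findings actually observed."""
    return len(expected & findings) / len(expected)

def rank_globally(findings):
    active = activate(findings)
    return sorted(((local_score(f, findings), h) for h, f in active.items()),
                  reverse=True)

print(rank_globally({"hematuria", "flank pain"}))
```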

Relevance: 10.00%

Abstract:

Web threats are becoming a major issue for both governments and companies. Generally, web threats increased as much as 600% during the last year (WebSense, 2013). This is a significant issue, since many major businesses provide these services. Denial of Service (DoS) attacks are among the most significant web threats, and their aim is generally to exhaust the resources of the target machine (Mirkovic & Reiher, 2004). Distributed Denial of Service (DDoS) attacks are typically executed from many sources and can result in large traffic flows; during the last year, 11% of DDoS attacks exceeded 60 Gbps (Prolexic, 2013a). DDoS attacks are usually launched from large botnets, which are networks of remotely controlled computers. Growing efforts by governments and companies to shut down botnets (Dittrich, 2012) have led attackers to look for alternative DDoS attack methods. One of the techniques to which attackers are returning is the DDoS amplification attack. Amplification attacks use intermediate devices, called amplifiers, to amplify the attacker's traffic. This work outlines an evaluation tool and evaluates an amplification attack based on the Trivial File Transfer Protocol (TFTP). The attack can achieve an amplification factor of approximately 60, which rates highly alongside other researched amplification attacks. This could be a substantial issue globally, since the protocol is exposed by approximately 599,600 publicly open TFTP servers. Mitigation methods for this threat have also been considered and a variety of countermeasures are proposed. The effects of the attack on both amplifier and target were analysed using the proposed metrics. While it has been reported that breaching TFTP in this way is possible (Schultz, 2013), this paper provides a complete methodology for the setup of the attack and its verification.
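The metric at the centre of the evaluation can be illustrated with a small sketch; the packet sizes used below are placeholders chosen for this example rather than measurements from the paper.

```python
# Illustrative sketch only: bandwidth amplification factor of a
# reflection/amplification attack, i.e. bytes reflected at the victim
# per byte the attacker sends to the amplifier.
def amplification_factor(request_bytes, response_bytes, responses_per_request=1):
    """Total response traffic divided by the triggering request traffic."""
    return (response_bytes * responses_per_request) / request_bytes

# Example with placeholder sizes: a small read request that triggers several
# retransmissions of a larger data block multiplies the attacker's traffic.
print(amplification_factor(request_bytes=25,
                           response_bytes=516,
                           responses_per_request=3))
```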

Relevance: 10.00%

Abstract:

Grattan, J.P., Gilbertson, D.D., Hunt, C.O. (2007). The local and global dimensions of metalliferous air pollution derived from a reconstruction of an 8 thousand year record of copper smelting and mining at a desert-mountain frontier in southern Jordan. Journal of Archaeological Science 34, 83-110.

Relevance: 10.00%

Abstract:

Booth, Ken, Theory of World Security (Cambridge: Cambridge University Press, 2008), pp.xviii+489 RAE2008

Relevance: 10.00%

Abstract:

Shepherd, Alistair, and T. C. Salmon, Toward a European Army: A Military Power in the Making? (Boulder, CO: Lynne Rienner, 2003), pp.x+237 RAE2008