923 results for Quaternary Convergence


Relevance: 20.00%

Abstract:

On the basis of accelerator mass spectrometer radiocarbon (AMS 14C) dating, sedimentation rates of 11 cores collected from the northern to the southern Okinawa Trough are discussed. Sedimentation rates in the Okinawa Trough range roughly from 11 to 39 cm/ka, with an average of 23.0 cm/ka. Continental material from China is the main sediment source for the middle Okinawa Trough and also contributes substantially to the northern and southern Okinawa Trough. Sedimentation rates during marine oxygen isotope stage 2 (MIS 2) are uniformly higher than those during MIS 1 in the northern and middle Okinawa Trough, whereas the opposite holds in the southern Okinawa Trough. Sedimentation rates in the Okinawa Trough can therefore serve as a proxy for sediment source and an indicator of cooling events.
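
As a minimal illustration of how such rates are derived from dated horizons, the sketch below computes interval sedimentation rates from depth/age tie points; the depths and calibrated ages are invented placeholders, not data from the cores discussed here.

# Minimal sketch: interval sedimentation rates from AMS 14C tie points.
# Depths and ages below are hypothetical, not data from the study.

def sedimentation_rates(depths_cm, ages_ka):
    """Return cm/ka rates between successive dated horizons."""
    pairs = list(zip(depths_cm, ages_ka))
    return [(d1 - d0) / (a1 - a0) for (d0, a0), (d1, a1) in zip(pairs, pairs[1:])]

depths = [0, 120, 260, 430]        # depth in core (cm), hypothetical
ages = [0.0, 5.2, 11.8, 20.1]      # calibrated age (ka BP), hypothetical

print(sedimentation_rates(depths, ages))  # roughly [23.1, 21.2, 20.5] cm/ka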

Relevance: 20.00%

Abstract:

Based on foraminiferal analyses and accelerator mass spectrometer radiocarbon dating of core DGKS9603, recovered from the floor of the middle Okinawa Trough, a curve relating sea surface temperature to depth in the core was obtained using the FP-12E transfer function. The whole core records seven cold phases and eight warm phases. The clear expression of low-temperature events during the Middle and Late Holocene, of the YD, H1, H2, H3 and H4 events, and of a short cold phase during the middle of the last glacial period implies that these short-term shifts since 50 ka BP were of global significance. Sedimentation rates during cold phases are generally higher than during warm phases, with the lowest rate in the Holocene, which may be connected with rising sea level and the westward shift of the main axis of the Kuroshio Current. Volcanic activity was intense in the Okinawa Trough during the Quaternary, so abundant volcanic glass and pumice are well preserved.

Relevance: 20.00%

Abstract:

The evolution of the Tsushima Warm Current during the late Quaternary was reconstructed from quantitative census data of planktonic foraminiferal faunas, together with oxygen and carbon isotope records of the mixed-layer dweller G. ruber and the thermocline dweller N. dutertrei, in piston core CSH1 and core DGKS9603, collected from the Tsushima Warm Current area and the Kuroshio-dominated area, respectively. The results show that the Tsushima Warm Current vanished during the lowstand period of 40-24 cal ka BP, while the Kuroshio still flowed across the Okinawa Trough, driving strong upwelling in the northern Trough. Meanwhile, the influence of freshwater greatly increased in the northern Okinawa Trough as the broad East China Sea continental shelf emerged. Freshwater influence peaked during the Last Glacial Maximum (LGM), when upwelling weakened markedly owing to the sea-level minimum and the weakening of the Kuroshio. The modern Tsushima Warm Current began to develop at about 16 cal ka BP; the influence of the Kuroshio in the middle and northern Okinawa Trough increased synchronously during the deglaciation, and the Kuroshio gradually became the main water source of the Tsushima Warm Current. The modern Tsushima Warm Current was finally established at about 8.5 cal ka BP, and the circulation structure has been relatively stable since then. The water of the modern Tsushima Warm Current comes primarily from the Kuroshio axis. A short-term fluctuation of the current occurred at about 3 cal ka BP, probably reflecting a strengthened winter monsoon and a weakened Kuroshio; cold water masses strengthened greatly during this interval.

Relevance: 20.00%

Abstract:

The Expectation-Maximization (EM) algorithm is an iterative approach to maximum likelihood parameter estimation. Jordan and Jacobs (1993) recently proposed an EM algorithm for the mixture of experts architecture of Jacobs, Jordan, Nowlan and Hinton (1991) and the hierarchical mixture of experts architecture of Jordan and Jacobs (1992). They showed empirically that the EM algorithm for these architectures yields significantly faster convergence than gradient ascent. In the current paper we provide a theoretical analysis of this algorithm. We show that the algorithm can be regarded as a variable metric algorithm with its search direction having a positive projection on the gradient of the log likelihood. We also analyze the convergence of the algorithm and provide an explicit expression for the convergence rate. In addition, we describe an acceleration technique that yields a significant speedup in simulation experiments.
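
As a generic, minimal illustration of the E-step/M-step iteration analyzed here (a two-component one-dimensional Gaussian mixture rather than the mixture-of-experts architecture itself; the data and starting values are invented), the sketch below shows one common form of the EM update.

import numpy as np

# Generic EM sketch for a two-component 1-D Gaussian mixture.
# Illustrates the E/M iteration only; it is not the mixture-of-experts
# algorithm analyzed in the paper. Data and initial values are invented.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 300)])

pi = np.array([0.5, 0.5])                  # mixing weights
mu = np.array([-1.0, 1.0])                 # component means
sigma = np.array([1.0, 1.0])               # component standard deviations

def component_densities(x, mu, sigma):
    z = (x[:, None] - mu) / sigma
    return np.exp(-0.5 * z ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: posterior responsibility of each component for each point
    weighted = pi * component_densities(x, mu, sigma)
    resp = weighted / weighted.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and standard deviations
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(pi, mu, sigma)   # should approach the generating mixture parameters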

Relevance: 20.00%

Abstract:

McLaren, S., Gilbertson, D., Grattan, J., Hunt, C., Duller, G. and Barker, G. (2004). Quaternary palaeogeomorphologic evolution of the Wadi Faynan area, Southern Jordan. Palaeogeography, Palaeoclimatology, Palaeoecology, 205, pp. 131-154.

Relevance: 20.00%

Abstract:

Koven, M. (2007). Most Haunted and the Convergence of Traditional Belief and Popular Television. Folklore, 118(2), pp. 183-202. RAE2008

Relevance: 20.00%

Abstract:

Accurate knowledge of traffic demands in a communication network enables or enhances a variety of traffic engineering and network management tasks of paramount importance for operational networks. Directly measuring a complete set of these demands is prohibitively expensive because of the huge amounts of data that must be collected and the performance impact that such measurements would impose on the regular behavior of the network. As a consequence, we must rely on statistical techniques to produce estimates of actual traffic demands from partial information. The performance of such techniques is limited, however, by their reliance on partial information and by the heavy computation they require, which hampers their convergence. In this paper we study strategies to improve the convergence of a powerful statistical technique based on an Expectation-Maximization (EM) iterative algorithm. First, we analyze modeling approaches for generating starting points. We call these starting points informed priors, since they are obtained using actual network information such as packet traces and SNMP link counts. Second, we provide a very fast variant of the EM algorithm that extends its computational range, increasing its accuracy and decreasing its dependence on the quality of the starting point. Finally, we study the convergence characteristics of our EM algorithm and compare it against a recently proposed Weighted Least Squares approach.
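
As a toy illustration of the kind of EM-style iteration involved, the sketch below applies a multiplicative update for Poisson-modeled origin-destination demands given link counts and a routing matrix; the matrix, counts and starting point are invented, and this is not the paper's accelerated variant or its informed priors.

import numpy as np

# Toy EM-style iteration for estimating origin-destination (OD) traffic
# demands x from link counts y = A @ x, assuming Poisson OD demands.
# The routing matrix, link counts and starting point are invented.

A = np.array([[1, 1, 0],   # link 1 carries OD flows 1 and 2
              [0, 1, 1],   # link 2 carries OD flows 2 and 3
              [1, 0, 1]])  # link 3 carries OD flows 1 and 3
y = np.array([30.0, 50.0, 40.0])   # measured link counts

x = np.full(3, y.mean())           # starting point (an "uninformed" prior)
for _ in range(200):
    # Multiplicative update: redistribute each link's count over the OD
    # flows it carries, in proportion to the current estimates.
    x = x * (A.T @ (y / (A @ x))) / A.sum(axis=0)

print(x, A @ x)  # estimated demands and the link counts they reproduce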

Relevance: 20.00%

Abstract:

The increased diversity of Internet application requirements has spurred recent interest in flexible congestion control mechanisms. Window-based congestion control schemes use increase rules to probe available bandwidth, and decrease rules to back off when congestion is detected. The parameterization of these control rules is chosen so that the resulting protocol is TCP-friendly in terms of the relationship between throughput and packet loss rate. In this paper, we propose a novel window-based congestion control algorithm called SIMD (Square-Increase/Multiplicative-Decrease). In contrast to previous memoryless controls, SIMD uses history information in its control rules. It uses multiplicative decrease, but increases the window size in proportion to the square of the time elapsed since the detection of the last loss event. Thus, SIMD can efficiently probe available bandwidth. Nevertheless, SIMD is TCP-friendly as well as TCP-compatible under RED, and it has much better convergence behavior than the recently proposed TCP-friendly AIMD and binomial algorithms.
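
A toy simulation of the window dynamics just described is sketched below: the window grows with the square of the time since the last loss and is cut multiplicatively on loss. The constants and loss pattern are invented and do not reproduce the paper's TCP-friendly parameterization.

# Toy simulation of a SIMD-like congestion window: between losses the
# window grows in proportion to the square of the elapsed time since the
# last loss; on a loss it is cut multiplicatively. Constants and the loss
# pattern are hypothetical.

ALPHA = 0.05   # hypothetical growth constant
BETA = 0.5     # hypothetical multiplicative-decrease factor

def simulate(rtts=60, losses={20, 45}):
    w_after_loss, t_since_loss = 10.0, 0
    history = []
    for rtt in range(rtts):
        if rtt in losses:
            w_after_loss *= BETA       # multiplicative decrease
            t_since_loss = 0
        t_since_loss += 1
        # square increase: window grows with the square of elapsed time
        w = w_after_loss + ALPHA * t_since_loss ** 2
        history.append(round(w, 1))
    return history

print(simulate())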

Relevance: 20.00%

Abstract:

The Border Gateway Protocol (BGP) is the current inter-domain routing protocol used to exchange reachability information between Autonomous Systems (ASes) in the Internet. BGP supports policy-based routing, which allows each AS to independently adopt a set of local policies specifying which routes it accepts and advertises from/to other networks, as well as which route it prefers when more than one route becomes available. However, independently chosen local policies may cause global conflicts, which result in protocol divergence. In this paper, we propose a new algorithm, called the Adaptive Policy Management Scheme (APMS), to resolve policy conflicts in a distributed manner. Akin to distributed feedback control systems, each AS independently classifies the state of the network as either conflict-free or potentially conflicting by observing its local history only (namely, route flaps). Based on the degree of measured conflicts, each AS operates in either a conflict-avoidance or a conflict-control mode and dynamically adjusts its own path preferences, increasing its preference for observably stable paths over flapping paths. APMS also includes a mechanism to distinguish route flaps due to topology changes, so as not to confuse them with those due to policy conflicts. A correctness and convergence analysis of APMS based on the substability property of chosen paths is presented. APMS was implemented in the SSF network simulator, and simulation results for different performance metrics are presented. The metrics capture the dynamic performance (in terms of instantaneous throughput, delay, routing load, etc.) of APMS and other competing solutions, thus exposing often-neglected aspects of performance.
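
A highly simplified sketch of the flap-driven preference-adjustment idea is given below; the thresholds, penalty, and data structures are invented for illustration and do not follow the APMS specification.

# Highly simplified sketch of flap-driven path-preference adjustment in the
# spirit of APMS. Thresholds, state and update rules are hypothetical.

from collections import defaultdict

FLAP_THRESHOLD = 5      # hypothetical: flaps before entering conflict-control mode
PENALTY = 10            # hypothetical: preference penalty per observed flap

class PolicyManager:
    def __init__(self, base_pref):
        self.base_pref = dict(base_pref)   # operator-chosen local preferences
        self.flaps = defaultdict(int)      # observed route flaps per path

    def record_flap(self, path):
        self.flaps[path] += 1

    def mode(self):
        # conflict-free vs. potentially conflicting, judged from local history only
        return "control" if max(self.flaps.values(), default=0) >= FLAP_THRESHOLD else "avoidance"

    def effective_pref(self, path):
        # prefer observably stable paths: penalize paths that keep flapping
        return self.base_pref[path] - PENALTY * self.flaps[path]

    def best_path(self, available):
        return max(available, key=self.effective_pref)

pm = PolicyManager({"via_AS1": 200, "via_AS2": 150})
for _ in range(6):
    pm.record_flap("via_AS1")
print(pm.mode(), pm.best_path(["via_AS1", "via_AS2"]))  # control, via_AS2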

Relevance: 20.00%

Abstract:

The climatic development of the Mid to Late Quaternary (the last 400,000 years) is characterised by fluctuations between glacial and interglacial periods, leading to the present interglacial, the Holocene. Compared with preceding periods, the Holocene was believed to represent a time of relative climatic stability. However, recent work has shown that the Holocene can be divided into cooler periods, such as the Little Ice Age, alternating with intervals of ameliorated climatic conditions, e.g. the Medieval Warm Period, the Holocene Thermal Optimum and the present Modern Optimum. In addition, the Holocene is recognised as a period of increasing anthropogenic influence on the environment. Onshore records of glacial/interglacial cycles and of anthropogenic effects are limited. However, sites of sediment accumulation on the shallow continental shelf offer the potential to reconstruct these events. Such sites include tunnel valleys and low-energy depositional settings. In this study we interrogated the sediment stratigraphy at such sites in the North Sea and Irish Sea using traditional techniques, as well as novel applications of geotechnical data, to reconstruct the palaeoenvironmental record. Within the German North Sea sector, a combination of core, seismic and in-situ Cone Penetration Testing (CPT) data was used to identify sedimentary units, place them within a morphological context, relate them stratigraphically to glacial or interglacial periods, and correlate them across the German North Sea. On this basis we were able to revise the Mid to Late Quaternary stratigraphy of the North Sea. Similarly, Holocene environmental changes were investigated within the Irish Sea at a depositional site subject to ongoing anthropogenic influence. The methods used included grain-size distribution analysis, foraminiferal analysis, gamma spectrometry, AMS 14C dating and physical core logging. The investigation revealed a strongly fluctuating climatic signal early in the area's history, before anthropogenic influence, in the form of trawling, affected the record.

Relevance: 20.00%

Abstract:

Based on Pulay's direct inversion in the iterative subspace (DIIS) approach, we present a method to accelerate self-consistent field (SCF) convergence. In this method, the quadratic augmented Roothaan-Hall (ARH) energy function, proposed recently by Høst and co-workers [J. Chem. Phys. 129, 124106 (2008)], is used as the object of minimization for obtaining the linear coefficients of the Fock matrices within DIIS. This differs from the traditional DIIS of Pulay, which uses an object function derived from the commutator of the density and Fock matrices. Our results show that the present algorithm, abbreviated ADIIS, is more robust and efficient than the energy-DIIS (EDIIS) approach. In particular, several examples demonstrate that the combination of ADIIS and DIIS ("ADIIS+DIIS") is highly reliable and efficient in accelerating SCF convergence.
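
To illustrate the extrapolation step that both DIIS variants share, the sketch below performs a generic Pulay-style least-squares mixing of stored Fock matrices using their error vectors; toy random matrices stand in for real Fock and error matrices, and this is not ADIIS's minimization of the ARH energy function over the coefficients.

import numpy as np

# Generic Pulay-style DIIS extrapolation: given stored Fock matrices and
# their error vectors, find coefficients c (summing to 1) that minimize
# ||sum_i c_i e_i|| and mix F_new = sum_i c_i F_i. Toy random matrices
# stand in for real Fock/error matrices.

def diis_extrapolate(focks, errors):
    n = len(focks)
    # B_ij = <e_i, e_j>, bordered with -1s to impose sum(c) = 1 via a Lagrange multiplier
    B = -np.ones((n + 1, n + 1))
    B[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            B[i, j] = np.vdot(errors[i], errors[j])
    rhs = np.zeros(n + 1)
    rhs[n] = -1.0
    coeffs = np.linalg.solve(B, rhs)[:n]
    return sum(c * F for c, F in zip(coeffs, focks)), coeffs

rng = np.random.default_rng(1)
focks = [rng.standard_normal((4, 4)) for _ in range(3)]
errors = [rng.standard_normal((4, 4)) * 0.1 ** k for k in range(3)]
F_new, c = diis_extrapolate(focks, errors)
print(c, c.sum())   # coefficients sum to 1; later (smaller-error) matrices dominate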

Relevance: 20.00%

Abstract:

Numerical approximation of the long-time behavior of a stochastic differential equation (SDE) is considered. Error estimates for time-averaging estimators are obtained and then used to show that the stationary behavior of the numerical method converges to that of the SDE. The error analysis is based on using an associated Poisson equation for the underlying SDE. The main advantages of this approach are its simplicity and universality. It works equally well for a range of explicit and implicit schemes, including those with simple simulation of random variables, and for hypoelliptic SDEs. To simplify the exposition, we consider only the case where the state space of the SDE is a torus, and we study only smooth test functions. However, we anticipate that the approach can be applied more widely. An analogy between our approach and Stein's method is indicated. Some practical implications of the results are discussed.
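
A minimal sketch of the setting follows: Euler-Maruyama for an overdamped Langevin SDE on the torus [0, 2*pi), with a time-averaging estimator of a smooth test function. The potential, step size and horizon are invented for illustration and are not tied to the paper's error analysis.

import numpy as np

# Minimal sketch: Euler-Maruyama for dX = -V'(X) dt + sqrt(2) dW on the
# torus [0, 2*pi), with a time-averaging estimator of a smooth test
# function. Potential, step size and horizon are hypothetical.

rng = np.random.default_rng(0)

def grad_V(x):
    return np.sin(x)          # V(x) = 1 - cos(x), a smooth periodic potential

def phi(x):
    return np.cos(x)          # smooth test function whose stationary mean we estimate

dt, n_steps = 0.01, 200_000
x, running_sum = 0.0, 0.0
for _ in range(n_steps):
    x += -grad_V(x) * dt + np.sqrt(2 * dt) * rng.standard_normal()
    x %= 2 * np.pi            # wrap back onto the torus
    running_sum += phi(x)

print(running_sum / n_steps)  # time-average estimate of the stationary mean of phi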

Relevance: 20.00%

Abstract:

Variations in the ratio of magnesium to calcium (Mg/Ca) in fossil ostracodes from Deep Sea Drilling Project Site 607 in the deep North Atlantic show that the change in bottom water temperature during late Pliocene 41,000-year obliquity cycles averaged 1.5°C between 3.2 and 2.8 million years ago (Ma) and increased to 2.3°C between 2.8 and 2.3 Ma, coincident with the intensification of Northern Hemisphere glaciation. During the last two 100,000-year glacial-to-interglacial climatic cycles of the Quaternary, bottom water temperatures changed by 4.5°C. These results show that glacial deep-water cooling has intensified since 3.2 Ma, most likely as the result of progressively diminished deep-water production in the North Atlantic and of the greater influence of Antarctic bottom water in the North Atlantic during glacial periods. The ostracode Mg/Ca data also allow the direct determination of the temperature component of the benthic foraminiferal oxygen isotope record from Site 607, as well as derivation of a hypothetical sea-level curve for the late Pliocene and late Quaternary. The effects of dissolution on the Mg/Ca ratios of ostracode shells appear to have been minimal.
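
As a schematic of the two-step logic (Mg/Ca to bottom water temperature, then removal of the temperature component from the benthic oxygen isotope record), the sketch below uses purely hypothetical calibration coefficients and input values, not the study's calibrations or data.

# Schematic of the two-step logic: (1) convert ostracode Mg/Ca to bottom
# water temperature with a linear calibration, (2) subtract the temperature
# effect from benthic foraminiferal d18O to isolate the seawater
# (ice-volume / sea-level) component. All coefficients and inputs below
# are hypothetical placeholders.

A, B = 0.9, 0.075          # hypothetical: Mg/Ca (mmol/mol) = A + B * T (deg C)
D18O_SLOPE = 0.25          # hypothetical: per-mil change in calcite d18O per deg C

def mgca_to_temp(mgca):
    return (mgca - A) / B

def seawater_d18o_anomaly(d18o_calcite, temp, ref_temp=2.0, ref_d18o=3.5):
    # remove the temperature component relative to a reference state
    return (d18o_calcite - ref_d18o) + D18O_SLOPE * (temp - ref_temp)

mgca = 1.2                                     # hypothetical measured ratio, mmol/mol
t = mgca_to_temp(mgca)
print(t, seawater_d18o_anomaly(4.0, t))        # inferred temperature and d18O_sw anomaly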