968 results for Order of Convergence


Relevance: 100.00%

Abstract:

Investigations of piston cores from the Vema Channel and lower flanks of the Rio Grande Rise suggest the presence of episodic flow of deep and bottom water during the Late Pleistocene. Cores from below the present-day foraminiferal lysocline (at ~4000 m) contain an incomplete depositional record consisting of Mn nodules and encrustations, hemipelagic clay, displaced high-latitude diatoms, and poorly preserved heterogeneous microfossil assemblages. Cores from the depth range between 2900 m and 4000 m contain an essentially complete Late Pleistocene record, and consist of well-defined carbonate dissolution cycles with periodicities of ~100,000 years. Low carbonate content and increased dissolution correspond to glacial episodes, as interpreted by oxygen isotopic analysis of bulk foraminiferal assemblages. The absence of diagnostic high-latitude indicators (Antarctic diatoms) within the dissolution cycles, however, suggests that AABW may not have extended to significantly shallower elevations on the lower flanks of the Rio Grande Rise during the Late Pleistocene. Therefore, episodic AABW flow may not necessarily be the mechanism responsible for producing these cyclic events. This interpretation is also supported by the presence of an apparently complete Brunhes depositional record in the same cores, suggesting current velocities insufficient for significant erosion. Fluctuations in the properties and flow characteristics of another water mass, such as NADW, may be involved. The geologic evidence in core-top samples near the present-day AABW/NADW transition zone is consistent with either of two possible interpretations of the upper limit of AABW on the east flank of the channel. The foraminiferal lysocline, at ~4000 m, is near the top of the benthic thermocline and nepheloid layer, and may therefore correspond to the upper limit of relatively corrosive AABW.
On the other hand, the carbonate compensation depth (CCD) at ~4250 m, which corresponds to the maximum gradient in the benthic thermocline, is characterized by rapid deposition of relatively fine-grained sediment. Such a zone of convergence and preferential sediment accumulation would be expected near the level of no motion in the AABW/NADW transition zone as a consequence of Ekman-layer veering of the mean velocity vector in the bottom boundary layer. It is possible that both of these interpretations are in part correct. The "level of no motion" may in fact correspond to the CCD, while at the same time relatively corrosive water of Antarctic origin may mix with overlying NADW and therefore elevate the foraminiferal lysocline to depths above the level of no motion. Closely spaced observations of the hydrography and flow characteristics within the benthic thermocline will be required in order to use sediment parameters as more precise indicators of paleo-circulation.

Relevance: 100.00%

Abstract:

Light rainfall is the baseline input to the annual water budget in mountainous landscapes throughout the tropics and at mid-latitudes. In the Southern Appalachians, the contribution from light rainfall ranges from 50-60% during wet years to 80-90% during dry years, with convective activity and tropical cyclone input providing most of the interannual variability. The Southern Appalachians is a region characterized by rich biodiversity that is vulnerable to land use/land cover changes due to its proximity to a rapidly growing population. Persistent near surface moisture and associated microclimates observed in this region have been well documented since the colonization of the area in terms of species health, fire frequency, and overall biodiversity. The overarching objective of this research is to elucidate the microphysics of light rainfall and the dynamics of low level moisture in the inner region of the Southern Appalachians during the warm season, with a focus on orographically mediated processes. The overarching research hypothesis is that physical processes leading to and governing the life cycle of orographic fog, low level clouds, and precipitation, and their interactions, are strongly tied to landform, land cover, and the diurnal cycles of flow patterns, radiative forcing, and surface fluxes at the ridge-valley scale. The following science questions will be addressed specifically: 1) How do orographic clouds and fog affect the hydrometeorological regime from event to annual scale and as a function of terrain characteristics and land cover?; 2) What are the source areas, governing processes, and relevant time-scales of near surface moisture convergence patterns in the region?; and 3) What are the four dimensional microphysical and dynamical characteristics, including variability and controlling factors and processes, of fog and light rainfall?
The research was conducted with two major components: 1) ground-based high-quality observations using multi-sensor platforms and 2) interpretive numerical modeling guided by the analysis of the in situ data collection. Findings illuminate a high level of spatial (down to the ridge scale) and temporal (from event to annual scale) heterogeneity in observations, and a significant impact on the hydrological regime as a result of seeder-feeder interactions among fog, low level clouds, and stratiform rainfall that enhance coalescence efficiency and lead to significantly higher rainfall rates at the land surface. Specifically, results show that short-term accumulation in an event can be enhanced by up to one order of magnitude when fog is concurrently present. Results also show that events are modulated strongly by terrain characteristics including elevation, slope, geometry, and land cover. These factors produce interactions between highly localized flows and gradients of temperature and moisture with larger scale circulations. Resulting observations of DSD and rainfall patterns are stratified by region and altitude and exhibit clear diurnal and seasonal cycles.

Relevance: 100.00%

Abstract:

The delta18O values of planktonic foraminifera increased in the Caribbean by about 0.5‰ relative to the equatorial East Pacific values between 4.6 and 4.2 Ma as a consequence of the closure of the Central American Gateway (CAG). This increase in delta18O can be interpreted either as an increase in Caribbean sea surface (mixed layer) salinity (SSS) or as a decrease in sea surface temperatures (SST). This problem represents an ideal situation to apply the recently developed paleotemperature proxy delta44/40Ca together with Mg/Ca and delta18O on the planktic foraminifer Globigerinoides sacculifer from ODP Site 999. Although differences in the absolute temperature calibration of delta44/40Ca and Mg/Ca exist, the general pattern is similar, indicating an SST decrease of about 2-3 °C between 4.4 and 4.3 Ma followed by an increase of the same order of magnitude between 4.3 and 4.0 Ma. Correcting the delta18O record for this temperature change and assuming that changes in global ice volume are negligible, the salinity-induced planktonic delta18O signal decreased by about 0.4‰ between 4.4 and 4.3 Ma and increased by about 0.9‰ between 4.3 and 4.0 Ma in the Caribbean. The observed temperature and salinity trends are interpreted to reflect the restricted exchange of surface water between the Caribbean and the Pacific in response to the shoaling of the Panamanian Seaway, possibly accompanied by a southward shift of the Intertropical Convergence Zone (ITCZ) between 4.4 and 4.3 Ma. Differences in Mg/Ca- and delta44/40Ca-derived temperatures can be reconciled by corrections for secular variations of marine Mg/Ca[sw] and delta44/40Ca, a salinity effect on the Mg/Ca ratio, and a constant temperature offset of ~2.5 °C between both SST proxy calibrations.
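The correction step described here can be sketched numerically. The sensitivity constant below is an assumption on my part (a commonly cited calibration of roughly -0.21‰ delta18O per °C of warming), not a value from this study, and the worked numbers are illustrative only:

```python
# Hypothetical sketch of the delta18O correction: subtract the
# temperature-driven part of a measured shift to isolate the
# salinity-driven component. The sensitivity is an assumed,
# commonly cited calibration, not a value from this study.
D18O_PER_DEGC = -0.21  # per mil delta18O per deg C of warming (assumed)

def salinity_component(total_d18o_change, sst_change_degc):
    """Split a measured delta18O change (per mil) using an
    independently derived SST change (deg C), e.g. from Mg/Ca
    or delta44/40Ca."""
    temperature_part = D18O_PER_DEGC * sst_change_degc
    return total_d18o_change - temperature_part

# Example: a +0.5 per mil shift during 2.5 deg C of cooling leaves
# only a small residual attributable to salinity.
print(round(salinity_component(0.5, -2.5), 3))
```

With a different assumed sensitivity the split changes accordingly, which is why the study's reconciliation of the Mg/Ca and delta44/40Ca calibrations matters.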

Relevance: 100.00%

Abstract:

Preparedness has become a central component of contemporary approaches to flood risk management, as there is a growing recognition that our reliance on engineered flood defences is unsustainable within the context of more extreme and unpredictable weather events. Whilst many researchers have focused their attention on exploring the key factors influencing flood-risk preparedness at the individual level, little consideration has been given to how we understand preparedness conceptually and practically in the first instance. This paper seeks to address this particular gap by identifying and analysing the diverse range of conceptualisations of preparedness and typologies of preparedness measures that exist within the literature in order to identify areas of convergence and divergence. In doing so, we demonstrate that a considerable degree of confusion remains in terms of how preparedness is defined, conceptualised and categorised. We conclude by reflecting on the implications this has from an academic perspective, but also in terms of the more practical aspects of flood risk management.

Relevance: 100.00%

Abstract:

Following the intrinsically linked balance sheets in his Capital Formation Life Cycle, Lukas M. Stahl explains with his Triple A Model of Accounting, Allocation and Accountability the stages of the Capital Formation process from FIAT to EXIT. Based on the theoretical foundations of legal risk laid by the International Bar Association with the help of Roger McCormick and legal scholars such as Joanna Benjamin, Matthew Whalley and Tobias Mahler, and founded on the basis of Wesley Hohfeld's category theory of jural relations, Stahl develops his mutually exclusive Four Determinants of Legal Risk of Law, Lack of Right, Liability and Limitation. Those Four Determinants of Legal Risk allow us to apply, assess, and precisely describe the respective legal risk at all stages of the Capital Formation Life Cycle, as demonstrated in case studies of nine industry verticals of the proposed and currently negotiated Transatlantic Trade and Investment Partnership between the United States of America and the European Union, TTIP, as well as in the case of the often cited financing relation between the United States and the People's Republic of China. Having established the Four Determinants of Legal Risk and their application to the Capital Formation Life Cycle, Stahl then explores the theoretical foundations of capital formation, their historical basis in classical and neo-classical economics and its forefathers such as The Austrians around Eugen von Boehm-Bawerk, Ludwig von Mises and Friedrich von Hayek and, most notably and controversially, Karl Marx, and their impact on today's exponential expansion of capital formation. Starting off with the first pillar of his Triple A Model, Accounting, Stahl then moves on to explain the Three Factors of Capital Formation, Man, Machines and Money, and shows how "value-added" is created with respect to the non-monetary capital factors of human resources and industrial production.
Following a detailed analysis discussing the roles of the Three Actors of Monetary Capital Formation, Central Banks, Commercial Banks and Citizens, Stahl readily dismisses a number of myths regarding the creation of money, providing in-depth insight into the workings of monetary policy makers, their institutions and ultimate beneficiaries, the corporate and consumer citizens. In his second pillar, Allocation, Stahl continues his analysis of the balance sheets of the Capital Formation Life Cycle by discussing the role of The Five Key Accounts of Monetary Capital Formation, the Sovereign, Financial, Corporate, Private and International accounts of Monetary Capital Formation, and the associated legal risks in the allocation of capital pursuant to his Four Determinants of Legal Risk. In his third pillar, Accountability, Stahl discusses the ever-recurring Crisis-Reaction-Acceleration-Sequence-History (in short: CRASH) since the beginning of the millennium: the dot-com crash at the turn of the millennium, followed seven years later by the financial crisis of 2008, and the dislocations in the global economy we are facing another seven years later, today in 2015, with several sordid debt restructurings under way and hundreds of thousands of refugees on the way, caused by war and increasing inequality. Together with the regulatory reactions they have caused in the form of so-called landmark legislation such as the Sarbanes-Oxley Act of 2002, the Dodd-Frank Act of 2010, the JOBS Act of 2012 or the introduction of the Basel Accords, Basel II in 2004 and III in 2010, the European Financial Stability Facility of 2010, the European Stability Mechanism of 2012 and the European Banking Union of 2013, Stahl analyses the acceleration in size and scope of crises that appears to find often seemingly helpless bureaucratic responses, the inherent legal risks and the complete lack of accountability on the part of those responsible.
Stahl argues that the order of the day requires addressing the root cause of the problems in the form of two fundamental design defects of our Global Economic Order, namely our monetary and judicial order. Inspired by a 1933 plan of nine University of Chicago economists to abolish the fractional reserve system, he proposes the introduction of Sovereign Money as a prerequisite to void misallocations by way of judicial order in the course of domestic and transnational insolvency proceedings, including the restructuring of sovereign debt, throughout the entire monetary system back to its origin without causing domino effects of banking collapses and failed financial institutions. Recognizing Austrian-American economist Schumpeter's Concept of Creative Destruction as a process of industrial mutation that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one, Stahl responds to Schumpeter's economic chemotherapy with his Concept of Equitable Default, mimicking an immunotherapy that strengthens the corpus economicus' own immune system by providing for the judicial authority to terminate precisely those misallocations that have proven malignant, causing default, pursuing the centuries-old common law concept of equity that allows for the equitable reformation, rescission or restitution of contract by way of judicial order. Following a review of the proposed mechanisms of transnational dispute resolution and current court systems with transnational jurisdiction, Stahl advocates, as a first step in order to complete the Capital Formation Life Cycle from FIAT, the creation of money by way of credit, to EXIT, the termination of money by way of judicial order, the institution of a Transatlantic Trade and Investment Court constituted by a panel of judges from the U.S. Court of International Trade and the European Court of Justice, following the model of the EFTA Court of the European Free Trade Association.
Since his proposal was first made public in June 2014, after being discussed in academic circles since 2011, his proposal and similar ones have found numerous public supporters. Most notably, the former Vice President of the European Parliament, David Martin, tabled an amendment in June 2015 in the course of the negotiations on TTIP calling for an independent judicial body, and the Member of the European Commission, Cecilia Malmström, presented her proposal of an International Investment Court on September 16, 2015. Stahl concludes that for the first time in the history of our generation there appears to be a real opportunity for reform of our Global Economic Order: curing the two fundamental design defects of our monetary and judicial order through the abolition of the fractional reserve system, the introduction of Sovereign Money, and the institution of a democratically elected Transatlantic Trade and Investment Court that, with its jurisdiction extending to cases concerning the Transatlantic Trade and Investment Partnership, may complete the Capital Formation Life Cycle, resolving cases of default with the transnational judicial authority for terminal resolution of misallocations in a New Global Economic Order, without the ensuing dangers of systemic collapse, from FIAT to EXIT.

Relevance: 100.00%

Abstract:

Introduction - No validated protocol exists for the measurement of prism fusion ranges. Many studies report on how fusional vergence ranges can be measured using different techniques (rotary prism, prism bar, loose prisms and synoptophore) and stimuli, leading to different ranges being reported in the literature. The repeatability of the different methods available, and the equivalence between them, are also important. In addition, some of the available studies do not agree on the order in which fusional vergence should be measured to provide the essential information on which to base clinical judgements on compensation of deviations. When performing fusional vergence testing, the most commonly accepted clinical technique is to first measure negative fusional vergence followed by a measurement of positive fusional vergence, to avoid affecting the value of vergence recovery because of excessive stimulation of convergence. Von Noorden recommends measuring vertical fusion amplitudes in between horizontal amplitudes (base-out, base-up, base-in, and base-down) to prevent vergence adaptation. Others place the base of the prism in the direction opposite to that used to measure the deviation, to increase the vergence demand. Objectives - The purpose of this review is to assess and compare the accuracy of tests for the measurement of fusional vergence. Secondary objectives are to investigate sources of heterogeneity of diagnostic accuracy, including: age; variation in method of assessment; study design; study size; type of strabismus (convergent, divergent, vertical, cyclo); and severity of strabismus (constant/intermittent/latent).

Relevance: 100.00%

Abstract:

This thesis proves certain results concerning an important question in non-equilibrium quantum statistical mechanics: the derivation of effective evolution equations approximating the dynamics of a system of a large number of bosons initially at equilibrium (ground state at very low temperatures). The dynamics of such systems are governed by the time-dependent linear many-body Schroedinger equation, from which it is typically difficult to extract useful information because the number of particles is large. We study quantitatively (i.e. with explicit bounds on the error) how a suitable one-particle non-linear Schroedinger equation arises in the mean field limit as the number of particles N → ∞, and how appropriate corrections to the mean field provide better approximations of the exact dynamics. In the first part of this thesis we consider the evolution of N bosons, where N is large, with two-body interactions of the form N³ᵝv(Nᵝ⋅), 0≤β≤1. The parameter β measures the strength and the range of the interactions. We compare the exact evolution with an approximation which considers the evolution of a mean field coupled with an appropriate description of pair excitations, see [18,19] by Grillakis-Machedon-Margetis. We extend the results for 0 ≤ β < 1/3 in [19,20] to the case β < 1/2 and obtain an error bound of the form p(t)/Nᵅ, where α>0 and p(t) is a polynomial, which implies a specific rate of convergence as N → ∞. In the second part, utilizing estimates of the type discussed in the first part, we compare the exact evolution with the mean field approximation in the sense of marginals. We prove that the exact evolution is close to the approximate one in trace norm for times of the order o(1)√N, compared to log(o(1)N) as obtained in Chen-Lee-Schlein [6] for the Hartree evolution. Estimates of similar type are obtained for stronger interactions as well.
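The meaning of a rate of convergence α in a bound of the form p(t)/Nᵅ can be illustrated with a small empirical check. The routine and data below are my own synthetic sketch, not code or results from the thesis:

```python
import math

# Synthetic illustration: when errors obey a bound of the form
# p(t) / N**alpha, the rate alpha can be estimated from errors at
# increasing particle numbers N via a log-log least-squares slope.
def estimated_rate(ns, errors):
    """Negated least-squares slope of log(error) against log(N)."""
    xs = [math.log(n) for n in ns]
    ys = [math.log(e) for e in errors]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope /= sum((x - mx) ** 2 for x in xs)
    return -slope

# Errors decaying exactly like 1/N**0.5 recover alpha = 0.5:
ns = [10, 100, 1000, 10000]
errs = [n ** -0.5 for n in ns]
print(estimated_rate(ns, errs))
```

In practice the polynomial prefactor p(t) only shifts the log-log line vertically, so the slope still recovers α for fixed t.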

Relevance: 100.00%

Abstract:

Human resources are an essential element in territorial development. When these are characterized by a high level of training, they also enhance a number of effects in areas fundamental to the territory-social cohesion binomial. In this respect, the existence of higher education institutions throughout the territory allows the qualification of human resources to spread but, by itself, does not guarantee the retention of these resources in the different regions. Thus, the objective of this paper is to undertake a spatial analysis of the convergence of knowledge by studying the evolution of the percentage of the population with higher education over the periods between the last three censuses in Portugal. Although that percentage has risen appreciably, the convergence is shown to be (very) insignificant.
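One standard diagnostic for this kind of question is sigma-convergence: regions converge if the relative dispersion of the indicator shrinks between censuses. The sketch below is my own illustration with invented figures, not data from the paper:

```python
import statistics

# Hedged illustration of sigma-convergence: all figures are invented.
share_1991 = [4.0, 6.0, 10.0, 3.0]   # hypothetical regional shares (%)
share_2011 = [9.0, 11.0, 15.0, 8.0]  # same regions, two censuses later

def cv(xs):
    """Coefficient of variation: spread relative to the mean."""
    return statistics.pstdev(xs) / statistics.fmean(xs)

# A lower later value indicates convergence across regions.
print(cv(share_1991), cv(share_2011))
```

Note that in this invented example every region gains the same absolute amount, so the relative dispersion falls even though the absolute gaps persist, which is one way a rising national percentage can coexist with weak convergence.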

Relevance: 100.00%

Abstract:

We address the question of the rates of convergence of the p-version interior penalty discontinuous Galerkin method (p-IPDG) for second order elliptic problems with non-homogeneous Dirichlet boundary conditions. It is known that the p-IPDG method admits slightly suboptimal a priori bounds with respect to the polynomial degree (in the Hilbertian Sobolev space setting). An example for which the suboptimal rate of convergence with respect to the polynomial degree is both proven theoretically and validated in practice through numerical experiments is presented. Moreover, the performance of p-IPDG on the related problem of p-approximation of corner singularities is assessed both theoretically and numerically, witnessing an almost doubling of the convergence rate of the p-IPDG method.
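Suboptimality in p is typically detected by comparing errors at successive polynomial degrees. The snippet below, with invented numbers, shows the kind of observed-rate computation used in such numerical validation; it is a generic sketch, not code from the paper:

```python
import math

# Sketch with invented numbers: the observed algebraic convergence
# rate between successive polynomial degrees, the kind of diagnostic
# used to detect a half-order suboptimality in p-version error bounds.
def observed_rate(p1, e1, p2, e2):
    """Rate r such that error ~ p**(-r) between degrees p1 and p2."""
    return math.log(e1 / e2) / math.log(p2 / p1)

# Errors decaying like p**-1.5, i.e. half an order below p**-2:
errors = {p: p ** -1.5 for p in (2, 4, 8)}
print(observed_rate(2, errors[2], 4, errors[4]))
```

If the theoretical bound predicts rate s but the observed rate settles at s - 1/2, the half-order gap is real rather than an artifact of the analysis.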

Relevance: 100.00%

Abstract:

Establishing regulation of the digital economy in Senegal is a fundamental challenge for policymakers and for all the actors that make it up. Following an increasingly globalized approach, major normative shifts affecting the rationales and mechanisms of regulation have evolved over time, giving law a more prominent place in states' public policies. Different normative and institutional models have accordingly been adapted to deal with the phenomenon of convergence, depending on a country's regulatory context. In Senegal's current context, the separation between the regulation of telecommunications and of broadcasting, two sectors that have now converged, rests on a model of sector-specific regulation. Their convergence, however, has blurred the boundaries between them and now risks serious normative consequences, such as institutional or regulatory entanglement. Yet at the national level there is to date no legal text laying the foundations of convergent regulation. As to whether sector-specific regulation remains relevant in a digital environment marked by convergence, it turns out that it could be adopted as a short-term model. But in order to achieve economies of scale and regulate the various sectors and infrastructure industries effectively, a single regulatory model is needed, marked by the merger of the ARTP and the CNRA. On the one hand, sector-specific regulation can accompany the digital transition already under way; on the other, multisector regulation will serve once the convergence of the markets is established.

Relevance: 100.00%

Abstract:

My dissertation emphasizes the use of narrative structuralism and narrative theories about storytelling in order to build a discourse between the fields of New Media and Rhetoric and Composition. Propp's morphological analysis and the breaking down of stories into component pieces aid in the discussion of storytelling as it appears in and is mediated by digital and computer technologies. New Media and Rhetoric and Composition are aided by shared concerns for textual production and consumption. In using the notion of "kairotic reading" (KR), I show the interconnectedness and interdisciplinarity required in the development of pedagogy used to teach students to develop into reflective practitioners who are aware of their rhetorical surroundings and can make sound judgments concerning their own message generation and consumption in the workplace. KR is a transferable skill that is beneficial to students and teachers alike. The dissertation research utilizes theories of New Media and New Media-influenced practitioners, including Jenkins' theory of convergence, Bourdieu's notion of taste, Gee's term "semiotic domains," and Manovich's "modification." These theoretical pieces are combined in order to show how KR can be extended by convergent narrative practices. In order to build connections with New Media, the consideration and inclusion of Kress and van Leeuwen's multimodality, Selber's "reflective practitioners," and Selfe's definition of multimodal composing allow for a richer conversation around the implications of metacognitive development and practitioner reflexivity with scholars in New Media. My research also includes analysis of two popular media franchises, Deborah Harkness' A Discovery of Witches and Fox's Bones television series, to show similarities and differences among convergence-linked and multimodal narratives.
Lastly, I also provide example assignments that can be taken, further developed, and utilized in classrooms engaging in multimodal composing practices. This dissertation pushes consideration of New Media into the work already being performed by those in Rhetoric and Composition.


Relevance: 100.00%

Abstract:

We present some estimates of the time of convergence to the equilibrium distribution in autonomous and periodic non-autonomous graphs, with ergodic stochastic adjacency matrices, using the eigenvalues of these matrices. In this way we generalize previous results of several authors, who considered only reversible matrices.
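The flavor of such eigenvalue-based estimates can be sketched for the simplest autonomous case; this is a generic textbook-style illustration under stated assumptions, not the estimate proved in the paper:

```python
import math

# Illustrative sketch: for an ergodic stochastic matrix the distance
# to equilibrium decays roughly like lam**t, where lam is the
# second-largest eigenvalue modulus, so a tolerance eps is reached
# after about log(1/eps) / log(1/lam) steps.
def convergence_time(lam, eps):
    return math.log(1.0 / eps) / math.log(1.0 / lam)

# Two-state chain P = [[0.9, 0.1], [0.2, 0.8]]: eigenvalues 1 and 0.7
# (trace 1.7, determinant 0.7), so lam = 0.7.
print(convergence_time(0.7, 1e-6))
```

The smaller the spectral gap 1 - lam, the longer the convergence time, which is why the estimates hinge on the eigenvalues of the adjacency matrices.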

Relevance: 90.00%

Abstract:

Assessment and prediction of the impact of vehicular traffic emissions on air quality and exposure levels requires knowledge of vehicle emission factors. The aim of this study was the quantification of emission factors from an on-road measurement program conducted over twelve months at two sites in Brisbane: 1) a freeway-type site (free-flowing traffic at about 100 km/h, fleet dominated by small passenger cars - Tora St); and 2) a busy urban road with a stop/start traffic mode and a fleet comprising a significant fraction of heavy duty vehicles - Ipswich Rd. A physical model linking concentrations measured at the road for specific meteorological conditions with motor vehicle emission factors was applied for data analyses. The focus of the study was on submicrometer particles; however the measurements also included supermicrometer particles, PM2.5, carbon monoxide, sulfur dioxide, and oxides of nitrogen. The results of the study are summarised in this paper. In particular, the emission factors for submicrometer particles were 6.08 x 10^13 and 5.15 x 10^13 particles vehicle^-1 km^-1 for Tora St and Ipswich Rd respectively, and for supermicrometer particles at Tora St, 1.48 x 10^9 particles vehicle^-1 km^-1. Emission factors of diesel vehicles at both sites were about an order of magnitude higher than emissions from gasoline powered vehicles. For submicrometer particles the emission factors of gasoline vehicles were 6.08 x 10^13 and 4.34 x 10^13 particles vehicle^-1 km^-1 for Tora St and Ipswich Rd, respectively, and those of diesel vehicles were 5.35 x 10^14 and 2.03 x 10^14 particles vehicle^-1 km^-1 for Tora St and Ipswich Rd, respectively. For supermicrometer particles at Tora St the emission factors were 2.59 x 10^9 and 1.53 x 10^12 particles vehicle^-1 km^-1 for gasoline and diesel vehicles, respectively.
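Per-vehicle-km factors like these combine into a fleet-average factor by traffic weighting. In the sketch below the two factors are the Tora St submicrometer values quoted in the abstract, while the diesel traffic share is invented for illustration:

```python
# Arithmetic sketch: a fleet-average emission factor is the
# traffic-weighted mix of the gasoline and diesel per-vehicle-km
# factors. Factors are the Tora St submicrometer values from the
# abstract; the diesel traffic share is a hypothetical figure.
def fleet_average(ef_gasoline, ef_diesel, diesel_fraction):
    return (1.0 - diesel_fraction) * ef_gasoline + diesel_fraction * ef_diesel

EF_GASOLINE = 6.08e13  # particles per vehicle per km (Tora St)
EF_DIESEL = 5.35e14    # particles per vehicle per km (Tora St)

# With a hypothetical 5% diesel share, diesel vehicles still
# contribute roughly a third of the fleet's particle emissions.
print(fleet_average(EF_GASOLINE, EF_DIESEL, 0.05))
```

This leverage of a small diesel fraction follows directly from the order-of-magnitude gap between the two factors reported in the study.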