928 results for Variational Convergence


Relevance: 20.00%

Abstract:

Purpose: To investigate the influence of convergence on axial length and corneal topography in young adult subjects.

Methods: Fifteen emmetropic young adult subjects with normal binocular vision had axial length and corneal topography measured immediately before and after a 15-min period of base-out (BO) prismatic spectacle lens wear. Two different magnitudes of prismatic spectacles were worn in turn (8 Δ BO and 16 Δ BO), and for both tasks, distance fixation was maintained for the duration of lens wear. Eight subjects returned on a separate day for further testing and had axial length measured before, during, and immediately after a 15-min convergence task.

Results: No significant change in axial length was found either during or after the sustained convergence tasks (p > 0.6). Some small but significant changes in corneal topography were found after sustained convergence. The most significant corneal change was observed after the 16 Δ BO prism wear. The J0 component of the corneal refractive power vector changed by a small (mean change of 0.03 D after the 16 Δ BO task) but statistically significant (p = 0.03) amount as a result of the convergence task, indicative of a reduction in with-the-rule corneal astigmatism after convergence. Corneal axial power exhibited significant flattening in superior regions.

Conclusions: Axial length appears largely unchanged by a period of sustained convergence. However, small but significant changes occur in the topography of the cornea after convergence.
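The J0 value reported above comes from the standard power-vector decomposition (M, J0, J45) of a sphero-cylindrical refraction; a minimal sketch follows (the function name and argument order are illustrative, not taken from the study):

```python
# Power-vector decomposition of a sphero-cylindrical refraction
# (sphere and cylinder in dioptres, axis in degrees). Positive J0
# indicates with-the-rule astigmatism, the component the abstract
# reports as changing after convergence.
import math

def power_vector(sphere, cylinder, axis_deg):
    """Return (M, J0, J45) for a sphere/cylinder/axis refraction."""
    theta = math.radians(axis_deg)
    m = sphere + cylinder / 2.0              # spherical equivalent
    j0 = -(cylinder / 2.0) * math.cos(2 * theta)
    j45 = -(cylinder / 2.0) * math.sin(2 * theta)
    return m, j0, j45
```

For example, a refraction of -1.00 / -1.00 x 180 decomposes to M = -1.50 D, J0 = +0.50 D, J45 = 0 D.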

Relevance: 20.00%

Abstract:

Rapidly developing information and telecommunication technologies and their platforms in the late 20th Century helped improve urban infrastructure management and influenced quality of life. Telecommunication technologies make it possible for people to deliver text, audio and video material over wired, wireless or fibre-optic networks. Technology convergence amongst these digital devices continues to create new ways in which information and telecommunication technologies are used. The 21st Century is an era of converged information, in which people are able to access a variety of services, including internet and location-based services, through multi-functional devices such as mobile phones. This chapter discusses recent developments in telecommunication networks and trends in convergence technologies, their implications for urban infrastructure planning, and their implications for the quality of life of urban residents.

Relevance: 20.00%

Abstract:

Efficient and effective urban management systems for Ubiquitous Eco Cities require intelligent and integrated management mechanisms. This integration involves bringing together economic, socio-cultural and urban development with a well-orchestrated, transparent and open decision-making system and the necessary infrastructure and technologies. In Ubiquitous Eco Cities, telecommunication technologies play an important role in monitoring and managing activities via wired and wireless networks. In particular, technology convergence creates new ways in which information and telecommunication technologies are used and forms the backbone of urban management. The 21st Century is an era of converged information, in which people are able to access a variety of services, including internet and location-based services, through multi-functional devices; this convergence provides new opportunities in the management of Ubiquitous Eco Cities. This chapter discusses developments in telecommunication infrastructure and trends in convergence technologies and their implications for the management of Ubiquitous Eco Cities.

Relevance: 20.00%

Abstract:

The 'Queensland Model' grew out of three convergent agendas: educational renewal, urban redevelopment, and the Queensland state government's 'Smart State' strategy.

Relevance: 20.00%

Abstract:

An initialisation process is a key component in modern stream cipher design. A well-designed initialisation process should ensure that each key-IV pair generates a different key stream. In this paper, we analyse two ciphers, A5/1 and Mixer, for which this does not happen due to state convergence. We show how the state convergence problem occurs and estimate the effective key-space in each case.
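State convergence of the kind analysed here can be illustrated with a toy register whose state-update function is not one-to-one; the update rule and 4-bit state size below are invented for illustration and have nothing to do with the real A5/1 or Mixer internals.

```python
# Toy illustration of state convergence: a 4-bit "register" whose
# update mixes bits non-injectively, so distinct loaded states
# collapse onto the same state during initialisation, shrinking
# the effective key-space.

def toy_update(state):
    """Non-injective update on a 4-bit state (illustrative only)."""
    b3 = (state >> 3) & 1
    b0 = state & 1
    # OR is the non-injective step: the input pairs (0,1), (1,0)
    # and (1,1) all feed the same feedback bit into the new state.
    fb = b3 | b0
    return ((state << 1) & 0xF) | fb

def initialise(loaded_state, rounds=4):
    """Run the toy initialisation for a fixed number of rounds."""
    for _ in range(rounds):
        loaded_state = toy_update(loaded_state)
    return loaded_state

# Count distinct post-initialisation states over all 16 loadings:
reachable = {initialise(s) for s in range(16)}
effective = len(reachable)
```

After four rounds, only a fraction of the 16 loadable states survive, so several key-IV loadings would generate identical keystream; counting the image of the initialisation map is exactly how an effective key-space estimate is made.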

Relevance: 20.00%

Abstract:

Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
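The surrogate-loss idea can be made concrete: the hinge and (base-2) logistic losses below are convex functions of the margin y·f(x) that dominate the 0-1 loss pointwise. This is a standard textbook illustration, not code from the paper.

```python
# Convex surrogates of the 0-1 classification loss, written as
# functions of the margin m = y * f(x).
import math

def zero_one(margin):
    """0-1 loss: an error whenever the margin is non-positive."""
    return 0.0 if margin > 0 else 1.0

def hinge(margin):
    """Convex surrogate minimised by the support vector machine."""
    return max(0.0, 1.0 - margin)

def logistic(margin):
    """Logistic loss in base 2, so that logistic(0) = 1 and the
    surrogate upper-bounds the 0-1 loss pointwise."""
    return math.log2(1.0 + math.exp(-margin))

# Each surrogate dominates the 0-1 loss at every margin, which is
# the pointwise property that excess-risk bounds build on.
margins = [m / 10.0 for m in range(-30, 31)]
dominates = all(
    hinge(m) >= zero_one(m) and logistic(m) >= zero_one(m)
    for m in margins
)
```

Minimising such a convex surrogate is computationally tractable, and the abstract's variational transform quantifies how small surrogate excess risk forces small 0-1 excess risk.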

Relevance: 20.00%

Abstract:

Estimates of the half-life to convergence of prices across a panel of cities are subject to bias from three potential sources: inappropriate cross-sectional aggregation of heterogeneous coefficients, presence of lagged dependent variables in a model with individual fixed effects, and time aggregation of commodity prices. This paper finds no evidence of heterogeneity bias in annual CPI data for 17 U.S. cities from 1918 to 2006, but correcting for the “Nickell bias” and time aggregation bias produces a half-life of 7.5 years, shorter than estimates from previous studies.
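For reference, a half-life figure of this kind follows from the estimated persistence of price-level deviations: if deviations decay as an AR(1) process with coefficient rho, the half-life solves rho**t = 0.5. The coefficient value below is illustrative, not taken from the paper.

```python
# Half-life of convergence implied by AR(1) persistence.
import math

def half_life(rho):
    """Half-life for an AR(1) deviation y_t = rho * y_{t-1} + e_t.

    A shock decays geometrically, so the time for half of it to
    dissipate solves rho**t = 0.5, i.e. t = ln(0.5) / ln(rho).
    """
    if not 0.0 < rho < 1.0:
        raise ValueError("half-life is defined for 0 < rho < 1")
    return math.log(0.5) / math.log(rho)

# With annual data, a persistence of roughly 0.91 corresponds to a
# half-life of about 7.5 years (illustrative numbers only):
hl = half_life(0.912)
```

Biases that inflate the estimated rho (such as the time-aggregation and Nickell biases discussed above) therefore translate directly into longer estimated half-lives.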

Relevance: 20.00%

Abstract:

This thesis investigates profiling and differentiating customers through the use of statistical data mining techniques. The business application of our work centres on examining individuals' seldom-studied yet critical consumption behaviour over an extensive time period within the context of the wireless telecommunication industry; consumption behaviour (as opposed to purchasing behaviour) is behaviour that has been performed so frequently that it becomes habitual and involves minimal intention or decision making. Key variables investigated are the activity initiation timestamp and cell tower location, as well as the activity type and usage quantity (e.g., a voice call with duration in seconds); the research focuses on customers' spatial and temporal usage behaviour. The main methodological emphasis is on the development of clustering models based on Gaussian mixture models (GMMs), which are fitted using the recently developed variational Bayesian (VB) method. VB is an efficient deterministic alternative to the popular but computationally demanding Markov chain Monte Carlo (MCMC) methods. The standard VB-GMM algorithm is extended by allowing component splitting, so that it is robust to initial parameter choices and can automatically and efficiently determine the number of components. The new algorithm we propose allows more effective modelling of individuals' highly heterogeneous and spiky spatial usage behaviour, or more generally human mobility patterns; the term spiky describes data patterns with large areas of low probability mixed with small areas of high probability. Customers are then characterised and segmented based on the fitted GMM, which corresponds to how each of them uses the products/services spatially in their daily lives; this essentially captures their likely lifestyle and occupational traits.
Other significant research contributions include fitting GMMs to circular data (i.e., the temporal usage behaviour) using VB, and developing clustering algorithms suitable for high-dimensional data based on the use of VB-GMM.
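As a small concrete anchor for the GMM machinery described above, here is the component-responsibility calculation at the heart of mixture-model clustering, in pure Python for a 1-D mixture. This is a generic sketch of the shared E-step-style computation; the thesis's VB component-splitting algorithm itself is not reproduced here.

```python
# Posterior component responsibilities for a 1-D Gaussian mixture.
# Segmenting customers by a fitted GMM amounts to comparing these
# responsibilities across components.
import math

def gmm_responsibilities(x, weights, means, stds):
    """Return the posterior probability of each mixture component
    having generated the observation x."""
    dens = [
        w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        for w, m, s in zip(weights, means, stds)
    ]
    total = sum(dens)
    return [d / total for d in dens]

# A point near the second component's mean gets almost all of the
# responsibility (toy parameters, not fitted values):
r = gmm_responsibilities(5.0, weights=[0.5, 0.5],
                         means=[0.0, 5.0], stds=[1.0, 1.0])
```

In EM these responsibilities drive the parameter updates directly; in the VB treatment the analogous quantities are expectations under the variational posterior over the parameters.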

Relevance: 20.00%

Abstract:

The 21st century business environment is dominated by unprecedented change across a broad spectrum of social, economic, technological and cultural factors (Nowotny, Scott & Gibbons 2001). Among these, two broad trends, economic globalisation and rising knowledge intensity (Hart 2006), have come to distinguish organisational life. Under the weight of these transformational influences, the developed world, it seems, has arrived at a transformational moment. The far-reaching effects of the global financial crisis and its shadowy twin, the threat of a double-dip recession, continue to exert an unsteadying influence on global and corporate finances. Growth in developed economies has slumped, share prices have declined, the market value of corporations has slipped and unemployment rates, in the vast majority of developed economies, have risen. Gross domestic product (GDP) growth has retreated from the strong growth experienced in the late 1990s to negative growth in 2009 and a sluggish and unsteady recovery in 2010. In response, the reach of government participation in markets has been extended, bringing with it the need to transition to new governance and regulatory arrangements. Ongoing concerns regarding the pace and sustainability of the recovery remain front of mind, with bailouts, buybacks, borrowings and BP dominating news services: 'We are witnessing the reweaving of the social, political and economic fabric that binds our planet, with long-term consequences that are as or more profound than those of the industrial era' (Tapscott & Williams 2006, p. 59).

Relevance: 20.00%

Abstract:

Various time-memory tradeoff attacks on stream ciphers have been proposed over the years. However, the claimed success of these attacks assumes that the initialisation process of the stream cipher is one-to-one. Some stream cipher proposals do not have a one-to-one initialisation process. In this paper, we examine the impact of this on the success of time-memory-data tradeoff attacks. Under these circumstances, some attacks are more successful than previously claimed while others are less so. The conditions for both cases are established.

Relevance: 20.00%

Abstract:

As an international norm, the Responsibility to Protect (R2P) has gained substantial influence and institutional presence—and created no small controversy—in the ten years since its first conceptualisation. Conversely, the Protection of Civilians in Armed Conflict (PoC) has a longer pedigree and enjoys a less contested reputation. Yet UN Security Council action in Libya in 2011 has thrown into sharp relief the relationship between the two. UN Security Council Resolutions 1970 and 1973 follow exactly the process envisaged by R2P in response to imminent atrocity crimes, yet the operative paragraphs of the resolutions themselves invoke only PoC. This article argues that, while the agendas of PoC and R2P converge with respect to Security Council action in cases like Libya, outside this narrow context it is important to keep the two norms distinct. Peacekeepers, humanitarian actors, international lawyers, individual states and regional organisations are required to act differently with respect to the separate agendas and contexts covered by R2P and PoC. While overlap between the two does occur in highly visible cases like Libya, neither R2P nor PoC collapses normatively, institutionally or operationally into the other.

Relevance: 20.00%

Abstract:

Sfinks is a shift register based stream cipher designed for hardware implementation. The initialisation state update function is different from the state update function used for keystream generation. We demonstrate state convergence during the initialisation process, even though the individual components used in the initialisation are one-to-one. However, the combination of these components is not one-to-one.
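The final observation, that one-to-one components can combine into a non-one-to-one update, is easy to demonstrate on a toy two-word state; the mapping below is invented for illustration and is unrelated to Sfinks' actual update function.

```python
# Two-word toy state (a, b). Each component function is a bijection
# on 8-bit words, yet the combined update is not one-to-one.

def comp1(x):
    """Bijection on 8-bit words: XOR with a constant."""
    return x ^ 0xA5

def comp2(x):
    """Bijection on 8-bit words: rotate left by one bit."""
    return ((x << 1) | (x >> 7)) & 0xFF

def combined_update(a, b):
    # Both output words depend only on a XOR b, so any two states
    # whose halves have the same XOR converge to the same next state
    # even though comp1 and comp2 are individually one-to-one.
    t = a ^ b
    return comp1(t), comp2(t)

# Two distinct states that collide after a single update:
s1 = combined_update(0x00, 0x0F)
s2 = combined_update(0xF0, 0xFF)
```

The collision arises purely from how the one-to-one pieces are wired together, which is exactly the structural point made about the Sfinks initialisation.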