Abstract:
The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). Initially, an overview is given of linear prediction and adaptive filtering. The convergence and tracking properties of stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean square algorithm, including a modular structure, easily guaranteed stability, lower sensitivity to the eigenvalue spread of the input autocorrelation matrix, and straightforward quantization of the filter coefficients (commonly called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. The optimal lattice filter is then derived for frequency modulated signals by computing the optimal values of the residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of the adaptive reflection coefficients for frequency modulated signals by computing the average tracking model of these coefficients for the stochastic gradient lattice algorithm.
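The stochastic gradient lattice recursion described above can be illustrated for a single stage. The following toy sketch (not the thesis's derivation; the AR(1) input, step size, and seed are illustrative assumptions) adapts one reflection coefficient by a gradient step on the sum of squared forward and backward prediction errors:

```python
import numpy as np

rng = np.random.default_rng(0)
a, mu, N = 0.8, 0.005, 20000

# AR(1) input signal: x[n] = a*x[n-1] + w[n]
w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

# single-stage stochastic gradient lattice adaptation
k = 0.0
for n in range(1, N):
    f0, b0 = x[n], x[n - 1]        # order-0 forward error and delayed backward error
    f1 = f0 + k * b0               # order-1 forward prediction error
    b1 = b0 + k * f0               # order-1 backward prediction error
    k -= mu * (f1 * b0 + b1 * f0)  # gradient step on f1**2 + b1**2
```

For this AR(1) input the optimal first reflection coefficient is the negative lag-one correlation, -a, and k settles near that value, which is the kind of convergence behaviour the thesis analyses.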
The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using these analytical results, we show a new property of adaptive lattice filters: the polynomial-order-reducing property, which may be used to reduce the order of the polynomial phase of input frequency modulated signals. Two examples show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that, using this technique, a better probability of detection is obtained for the reduced-order phase signals than with the traditional energy detector. It is also empirically shown that the distribution of the gradient noise in the first adaptive reflection coefficient approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that this technique achieves a lower mean square error for the estimated frequencies at high signal-to-noise ratios than the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second type of input signal, i.e., impulsive autoregressive processes with alpha-stable distributions. The concept of alpha-stable distributions is first introduced. We show that the stochastic gradient algorithm, which performs well for finite-variance input signals (such as frequency modulated signals in noise), does not converge quickly for infinite-variance stable processes, because it relies on the minimum mean-square error criterion.
To deal with such problems, the minimum dispersion criterion, fractional lower order moments, and recently developed algorithms for stable processes are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean p-norm lattice algorithm and its normalized version, are proposed for lattice filters based on fractional lower order moments. Simulation results show that the proposed algorithms achieve faster convergence for parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness than many other algorithms. We also discuss the effect of the impulsiveness of stable processes in generating misalignment between the estimated parameters and their true values. Due to the infinite variance of stable processes, the performance of the proposed algorithms is investigated only through extensive computer simulations.
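The fractional-lower-order-moment idea behind the proposed algorithms can be sketched for a single lattice stage. The following is a minimal illustration, not the thesis's exact algorithm: the alpha-stable generator, the choice of p, the step size, and the normalization constant are all assumptions made here for the sketch. The reflection coefficient is adapted on the p-th power of the prediction errors (with p below the stable index alpha) and the step is normalized so that impulses do not destabilize the update:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, a, p, mu, N = 1.8, 0.5, 1.2, 0.01, 50000

# symmetric alpha-stable driving noise (Chambers-Mallows-Stuck method)
U = rng.uniform(-np.pi / 2, np.pi / 2, N)
W = rng.exponential(1.0, N)
w = (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
     * (np.cos((1 - alpha) * U) / W) ** ((1 - alpha) / alpha))

# impulsive AR(1) process with infinite variance
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

# single-stage normalized least-mean p-norm lattice update, p < alpha
k, eps = 0.0, 1e-3
for n in range(1, N):
    f0, b0 = x[n], x[n - 1]
    f1 = f0 + k * b0            # forward prediction error
    b1 = b0 + k * f0            # backward prediction error
    # gradient of |f1|^p + |b1|^p (fractional lower-order moment criterion)
    g = (np.abs(f1) ** (p - 1) * np.sign(f1) * b0
         + np.abs(b1) ** (p - 1) * np.sign(b1) * f0)
    # normalized step bounds the influence of impulsive samples
    k -= mu * g / (eps + np.abs(f0) ** p + np.abs(b0) ** p)
```

With mild impulsiveness (alpha close to 2) the coefficient settles in the vicinity of -a; the normalization is what keeps the recursion from being thrown far off course by individual impulses, mirroring the motivation for the normalized variant.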
Abstract:
The stylized facts that motivate this thesis include the diversity in growth patterns observed across countries during the process of economic development, and the divergence over time in income distributions both within and across countries. This thesis constructs a dynamic general equilibrium model in which technology adoption is costly and agents are heterogeneous in their initial holdings of resources. Given the households' resource levels, this study examines how adoption costs influence the evolution of household income over time and the timing of the transition to more productive technologies. The analytical results of the model characterize three growth outcomes associated with the technology adoption process, depending on productivity differences between the technologies. These are appropriately labeled 'poverty trap', 'dual economy' and 'balanced growth'. The model is thus capable of explaining the observed diversity in growth patterns across countries, as well as the divergence of incomes over time. Numerical simulations of the model further illustrate features of this transition. They suggest that differences in adoption costs account for the timing of households' decisions to switch technology, which leads to a disparity in incomes across households during the technology adoption process. Since this determines the timing of complete adoption of the technology within a country, the implications for cross-country income differences are obvious. Moreover, the timing of technology adoption appears to impact households' patterns of growth, which differ across income groups. The findings also show that, in the presence of costs associated with the adoption of more productive technologies, inequalities of income and wealth may increase over time, tending to delay the convergence in income levels.
Initial levels of inequality in resources also affect the date of complete adoption of more productive technologies. The issue of increasing income inequality in the process of technology adoption opens up another direction for research. Specifically, increasing inequality implies that distributive conflicts may emerge during the transitional process, with political-economy consequences. The model is therefore extended to include such issues. Without any political considerations, taxes would lead to a reduction in inequality and convergence of incomes across agents. However, this process is delayed if politico-economic influences are taken into account. Moreover, the political outcome is suboptimal, essentially because there is resistance associated with the complete adoption of the advanced technology.
Abstract:
The overarching aim of this study is to create new knowledge about how playful interactions (re)create the city via ubiquitous technologies, with an outlook to apply the knowledge for pragmatic innovations in relevant fields such as urban planning and technology development in the future. The study looks at the case of transyouth, the in-between demographic bridging youth and adulthood, in Seoul, one of the most connected, densely populated, and quickly transforming metropolises in the world. To unravel the elusiveness of ‘play’ as a subject and the complexity of urban networks, this study takes a three-tier transdisciplinary approach comprised of an extensive literature review, Shared Visual Ethnography (SVE), and interviews with leading industry representatives who design and develop the playscape for Seoul transyouth. Through these methodological tools, the study responds to the following four research aims:
1. Examine the sociocultural, technological, and architectural context of Seoul.
2. Investigate Seoul transyouth’s perception of the self and their technosocial environment.
3. Identify the pattern of their playful interaction through which meanings of the self and the city are recreated.
4. Develop an analytical framework for the enactment of play.
This thesis argues that the city is a contested space that continuously changes through multiple interactions among its constituents on the seam of control and freedom. At the core of this interactive (re)creation process is play. Play is a phenomenon enacted at the centre of three inter-related elements, pressure, possibility, and pleasure: the analytical framework this thesis puts forward as a conceptual apparatus for studying play across disciplines. The thesis concludes by illustrating possible trajectories for pragmatic application of the framework for envisioning and building the creative, sustainable, and seductive city.
Abstract:
At the centre of this research is an ethnographic study that saw the researcher embedded within the fabric of inner-city life to better understand which characteristics of user activity and interaction could be enhanced by technology. The initial research indicated that the experience of traversing the city after dark unified an otherwise divergent user group through a shared concern for personal safety. Managing this fear and danger represented an important user need. We found that mobile social networking systems are not only integral to bringing people together; they can also help users disperse safely. We conclude, however, that at a time when the average iPhone staggers under the weight of a plethora of apps that do everything from acting as a carpenter’s level to predicting pregnancy, there is potential for the functionality of a personal safety device to be embodied within a stand-alone artifact.
Abstract:
A high performance, low computational complexity rate-based flow control algorithm which can avoid congestion and achieve fairness is important to the ATM available bit rate service. The explicit rate allocation algorithm proposed by Kalampoukas et al. is designed to achieve max–min fairness in ATM networks. It has several attractive features, such as a fixed computational complexity of O(1) and guaranteed convergence to max–min fairness. In this paper, certain drawbacks of the algorithm, such as the severe overload of an outgoing link during the transient period and the non-conforming use of the current cell rate field in a resource management cell, are identified and analysed, and a new algorithm which overcomes these drawbacks is proposed. The proposed algorithm also simplifies the rate computation. Compared with the algorithm of Kalampoukas et al., it has better performance in terms of congestion avoidance and smoothness of rate allocation.
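Max–min fairness itself, the objective both algorithms target, can be illustrated with the classic water-filling computation for a single link. This sketch shows only the fairness criterion, not the O(1) per-cell explicit-rate algorithm of the paper; the capacity and demand figures are made up for illustration:

```python
def max_min_fair(capacity, demands):
    """Water-filling allocation of one link's capacity.

    Connections demanding less than an equal share keep their demand;
    the capacity they leave unused is re-shared among the rest.
    """
    rates, remaining, cap = {}, dict(demands), float(capacity)
    while remaining:
        share = cap / len(remaining)
        satisfied = {c: d for c, d in remaining.items() if d <= share}
        if not satisfied:                  # all remaining connections are
            for c in remaining:            # bottlenecked at this link:
                rates[c] = share           # each gets the equal share
            return rates
        for c, d in satisfied.items():     # grant small demands in full
            rates[c] = d
            cap -= d
            del remaining[c]
    return rates

# A gets its full 1, B its full 4, and C is capped at the leftover share
print(max_min_fair(10, {"A": 1, "B": 4, "C": 10}))
```

The explicit-rate algorithms discussed in the paper converge to exactly this allocation, but compute it incrementally from resource management cells rather than from global knowledge of all demands.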
Abstract:
The processes of digitization and deregulation have transformed the production, distribution and consumption of information and entertainment media over the past three decades. Today, researchers are confronted with profoundly different landscapes of domestic and personal media than the pioneers of qualitative audience research, which came to form much of the conceptual basis of Cultural Studies, first in Britain and North America and subsequently across all global regions. The process of media convergence, as a consequence of the dual forces of digitisation and deregulation, thus constitutes a central concept in the analysis of popular mass media. From the study of the internationalisation and globalisation of media content and changing regimes of media production, via the social shaping of communication technologies and, conversely, the impact of communication technology on social, cultural and political realities, to the emergence of transmedia storytelling, the interplay of intertextuality and genre, and the formation of mediated social networks, convergence informs and shapes contemporary conceptual debates in the field of popular communication and beyond. However, media convergence not only challenges the conceptual canon of (popular) communication research but also poses profound methodological challenges. As boundaries between producers and consumers become increasingly fluid, formerly stable fields and categories of research such as industries, texts and audiences intersect and overlap, requiring combined and new research strategies. This preconference aims to offer a forum to present and discuss methodological innovations in the study of contemporary media and the analysis of the social, cultural, and political impact and challenges arising through media convergence.
The preconference thus aims to focus on the following methodological questions and challenges:
* New strategies of audience research responding to the increasing individualisation of popular media consumption.
* Methods of data triangulation in and through the integrated study of media production, distribution and consumption.
* Bridging the methodological and often associated conceptual gap between qualitative and quantitative research in the study of popular media.
* The future of ethnographic audience and production research in light of blurring boundaries between media producers and consumers.
* A critical re-examination of which textual configurations can be meaningfully described and studied as text.
* Methodological innovations aimed at assessing the macro social, cultural and political impact of mediatization (including, but not limited to, "creative methods").
* Methodological responses to the globalisation of popular media and the practicalities of international and transnational comparative research.
* An exploration of new methods required in the study of media flow and intertextuality.
Abstract:
Purpose – The aims of this paper are to demonstrate the application of Sen’s theory of well-being, the capability approach; to conceptualise the state of transportation disadvantage; and to underpin a theoretically sound indicator selection process.
Design/methodology/approach – This paper reviews and examines various measurement approaches to transportation disadvantage in order to select indicators and develop an innovative framework of urban transportation disadvantage.
Originality/value – The paper provides further understanding of the state of transportation disadvantage from the capability approach perspective. In addition, building on this understanding, a validated and systematic framework is developed to select relevant indicators.
Practical implications – The multi-indicator approach tends to double-count transportation disadvantage, inflate the size of the transportation-disadvantaged (TDA) population, and account for each indicator only in terms of its individual effect. Instead, indicators identified on the basis of a transportation disadvantage scenario will yield more accurate results.
Keywords – transport disadvantage, the capability approach, accessibility, measuring urban transportation disadvantage, indicator selection
Paper type – Academic Research Paper
Abstract:
Emotions play a central role in mediation as they help to define the scope and direction of a conflict. When a party to mediation expresses (and hence entrusts) their emotions to those present in a mediation, a mediator must do more than simply listen: they must attend to these emotions. Mediator empathy is an essential skill for communicating to a party that their feelings have been heard and understood, but it can lead mediators into trouble. While there may exist a theoretical divide between the notions of empathy and sympathy, the very best characteristics of mediators (a caring and compassionate nature) may see empathy and sympathy merge, resulting in challenges to mediator neutrality. This article first outlines the semantic difference between empathy and sympathy and the role that intrapsychic conflict can play in the convergence of these behavioural phenomena. It then defines emotional intelligence in the context of a mediation, suggesting that only the most emotionally intelligent mediators are able to connect emotionally with the parties while maintaining an impression of impartiality: the quality of remaining ‘attached yet detached’ in the process. It is argued that these emotionally intelligent mediators share the qualities of strong self-awareness and emotional self-regulation.
Abstract:
It is widely contended that we live in a ‘world risk society’, where risk plays a central and ubiquitous role in contemporary social life. A seminal contributor to this view is Ulrich Beck, who claims that our world is governed by dangers that cannot be calculated or insured against. For Beck, risk is an inherently unrestrained phenomenon, emerging from a core and pouring out from and under national borders, unaffected by state power. Beck’s focus on risk’s ubiquity and uncontrollability at an infra-global level means that there is a necessary evenness to the expanse of risk: a ‘universalization of hazards’, which possess an inbuilt tendency towards globalisation. While sociological scholarship has examined the reach and impact of globalisation processes on the role and power of states, Beck’s argument that economic risk is without territory and resistant to domestic policy has come under less appraisal. This is contestable: what are often described as global economic processes, on closer inspection, reveal degrees of territorial embeddedness. This not only suggests that ‘global’ flows could sometimes be more appropriately explained as international, regional or even local processes, formed from and responsive to state strategies, but also demonstrates what can be missed if we overinflate the global. This paper briefly introduces two key principles of Beck’s theory of risk society and positions them within a review of literature debating the novelty and degree of global economic integration and its impact on states pursuing domestic economic policies. In doing so, this paper highlights the value for future research of engaging with questions such as ‘is economic risk really without territory?’ and ‘does risk produce convergence?’, not so much as a means of reducing Beck’s thesis to a purely empirical analysis, but rather to avoid limiting our scope in understanding the complex relationship between risk and state.
Abstract:
During a survey of faba bean viruses in West Asia and North Africa, a virus was identified as broad bean stain virus (BBSV) based on host reactions, electron microscopy, physical properties and serology. An antiserum to a Syrian isolate was prepared. With this antiserum, both the direct double antibody sandwich ELISA (DAS-ELISA) and dot-ELISA were very sensitive in detecting BBSV in leaf extracts, ground whole seeds and germinated embryos. Sensitivity was not reduced when the two-day procedure was replaced by a one-day procedure. Using ELISA, the virus was detected in 73 out of 589 faba bean samples with virus-like symptoms collected from Egypt (4 out of 70 samples tested), Lebanon (6/44), Morocco (0/7), Sudan (19/254), Syria (36/145) and Tunisia (8/69). This is the first report of BBSV infection of faba bean in Lebanon, Sudan, Syria and Tunisia. Fourteen wild legume species indigenous to Syria were susceptible to BBSV infection, with only two producing obvious symptoms. The virus was found to be seed transmitted in Vicia palaestina.
Abstract:
Recently, numerical modelling and simulation of fractional partial differential equations (FPDEs), which have found wide application in modern engineering and science, have been attracting increased attention. The currently dominant numerical method for modelling FPDEs is the explicit Finite Difference Method (FDM), which is based on a pre-defined grid, leading to inherent shortcomings. This paper aims to develop an implicit meshless approach based on radial basis functions (RBF) for the numerical simulation of time fractional diffusion equations. The discrete system of equations is obtained by using the RBF meshless shape functions and the strong form. The stability and convergence of this meshless approach are then discussed and theoretically proven. Several numerical examples with different problem domains are used to validate and investigate the accuracy and efficiency of the newly developed meshless formulation. The results obtained by the meshless formulation are also compared with those obtained by the FDM in terms of accuracy and efficiency. It is concluded that the present meshless formulation is very effective for the modelling and simulation of FPDEs.
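In time fractional diffusion equations the fractional derivative acts in time, and it is typically discretized before any spatial treatment (whether FDM or RBF collocation). The following is a minimal sketch of the standard L1 approximation of the Caputo derivative, not the paper's RBF formulation; the test function, order, and grid are illustrative assumptions:

```python
import math
import numpy as np

def caputo_l1(u, dt, alpha):
    """L1 approximation of the Caputo derivative of order alpha in (0,1),
    evaluated at the last point of the uniform-grid samples u[0..n]."""
    n = len(u) - 1
    j = np.arange(n)
    b = (j + 1.0) ** (1 - alpha) - j ** (1 - alpha)  # L1 weights b_j
    du = (u[1:] - u[:-1])[::-1]                      # u_{n-j} - u_{n-j-1}
    return dt ** (-alpha) / math.gamma(2 - alpha) * np.dot(b, du)

# check against the exact Caputo derivative of u(t) = t^2,
# which is 2 t^(2-alpha) / Gamma(3-alpha)
alpha, N = 0.5, 400
t = np.linspace(0.0, 1.0, N + 1)
approx = caputo_l1(t ** 2, 1.0 / N, alpha)
exact = 2.0 / math.gamma(3 - alpha)  # value at t = 1
print(abs(approx - exact) / exact)   # small discretization error
```

The L1 scheme has truncation error of order dt^(2-alpha), which is why halving the time step noticeably tightens the agreement; an implicit solver then couples these history terms with the spatial discretization at each step.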
Abstract:
Efficient and effective urban management systems for Ubiquitous Eco Cities require intelligent and integrated management mechanisms. This integration involves bringing together economic, socio-cultural and urban development with a well-orchestrated, transparent and open decision-making system and the necessary infrastructure and technologies. In Ubiquitous Eco Cities, telecommunication technologies play an important role in monitoring and managing activities via wired and wireless networks. In particular, technology convergence creates new ways in which information and telecommunication technologies are used and has formed the backbone of urban management. The 21st century is an era in which information has converged and people are able to access a variety of services, including internet and location-based services, through multi-functional devices, providing new opportunities for the management of Ubiquitous Eco Cities. This chapter discusses developments in telecommunication infrastructure, trends in convergence technologies, and their implications for the management of Ubiquitous Eco Cities.