52 results for time-frequency distribution (TFD)
Abstract:
Given the urgency of a new paradigm in wireless digital transmission that should allow for higher bit rates, lower latency and tighter delay constraints, it has been proposed to investigate the fundamental building blocks that, at the circuit/device level, will drive the change towards a more efficient network architecture with higher capacity, higher bandwidth and a more satisfactory end-user experience. At the core of each transceiver lie inherently analog devices that provide the carrier signal: the oscillators. It is widely believed that many limitations of today's communication protocols could be relieved by permitting radio transmission at high carrier frequencies and by allowing some degree of reconfigurability. This led us to study distributed oscillator architectures that operate in the microwave range and possess wideband tuning capability. As microwave oscillators are essentially nonlinear devices, a full nonlinear analysis, synthesis, and optimization had to be considered for their implementation. Consequently, the most widely used nonlinear numerical techniques in commercial EDA software have been reviewed. An application of these techniques has been shown for a system of three coupled oscillators (a "triple push" oscillator), in which the stability of the various oscillating modes has been studied. Provided that a certain phase distribution is maintained among the oscillating elements, this topology raises the output power at the third harmonic; nevertheless, due to circuit symmetry, "unwanted" oscillating modes coexist with the intended one. Starting with the necessary background on distributed amplification and distributed oscillator theory, the design of a four-stage reverse-mode distributed voltage-controlled oscillator (DVCO) using lumped elements has been presented. All the design steps have been reported and, for the first time, a method for an optimized design with reduced variation in the output power has been presented. Ongoing work is devoted to modelling a wideband DVCO and to implementing a frequency divider.
Abstract:
The objective of this research was to analyse the correspondence between the results of a land evaluation and the actual distribution of crops. To this end, the biophysical suitability of the land was compared with different typologies of crop occurrence frequency and with rotations derived from multitemporal crop maps. The research was carried out in the Flumen irrigation district (33,000 ha), located in the Ebro valley (NE Spain). The land evaluation was based on a 1:100,000 soil map, following the FAO framework, for the main crops present in the study area (alfalfa, winter cereals, maize, rice and sunflower). Three crop-frequency maps and one rotation map, derived from a time series of Landsat TM and ETM+ images covering the period 1993-2000, were compared with the land-suitability maps for the different crops. The relationship between the two types of variables was analysed statistically (Pearson χ2, Cramer's V, Gamma and Somers' D). The results show a significant relationship (P=0.001) between crop location and land suitability, except for opportunistic crops such as sunflower, which was strongly influenced by subsidies during the study period. Alfalfa-based rotations show the highest percentages (52%) of occupation of the land most suitable for agriculture in the study area. This multitemporal approach to analysing the information offers a more realistic picture than the comparison between a land-evaluation map and a single-date crop map when assessing the degree of agreement between land-suitability recommendations and the crops actually grown by farmers.
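As an illustration of the kind of association test mentioned above, the following minimal Python sketch computes Pearson's χ2 and Cramer's V for a purely hypothetical contingency table of land-suitability classes versus crop-frequency classes; the counts, class layout and scipy usage are assumptions, not data from the study.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = land-suitability classes,
# columns = crop-frequency classes (counts of map units)
table = np.array([[420, 180,  60],
                  [250, 310, 140],
                  [ 90, 200, 350]])

chi2, p, dof, expected = chi2_contingency(table)
n = table.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2 = {chi2:.1f}, p = {p:.3g}, Cramer's V = {cramers_v:.2f}")
```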
Abstract:
Emissions distribution is a focal variable in the design of future international agreements to tackle global warming. This paper analyses the future path of emissions distribution and its determinants under different scenarios. Our analysis relies on tools typically applied in the income distribution literature, which have recently been applied to the analysis of CO2 emissions distribution; the methodological novelty is that our study is driven by simulations run with a popular regionalised optimal growth climate change model over the period 1995-2105. We find that the architecture of environmental policies, the implementation of flexible mechanisms and income concentration are key determinants of emissions distribution over time. In particular, we find a robust positive relationship between inequality measures of emissions and of income.
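As a toy illustration of the income-distribution tools mentioned above applied to emissions, the sketch below computes a Gini index for a small, invented set of per-capita emissions figures; the values and regions are hypothetical, and the study itself works with model simulations rather than such a list.

```python
import numpy as np

def gini(x):
    """Gini index of a distribution (0 = perfect equality, 1 = maximal inequality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # standard formula based on the ordered values
    return (2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum())) - (n + 1) / n

# Hypothetical per-capita emissions (tCO2) for a handful of world regions
emissions_per_capita = [16.1, 9.4, 7.2, 2.0, 1.1, 0.3]
print(f"Gini index of per-capita emissions: {gini(emissions_per_capita):.2f}")
```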
Abstract:
An abundant scientific literature on climate change economics points out that the future participation of developing countries in international environmental policies will depend on their payoffs inside and outside specific agreements. These studies analyse coalition stability, typically through a game-theoretic approach. Although these contributions represent a cornerstone of the research investigating plausible future international coalitions and the reasons behind the difficulties encountered over time in implementing emissions-stabilizing actions, they cannot satisfactorily disentangle the role that equality plays in inducing poor regions to tackle global warming. If we focus on the Stern Review findings, which stress that climate change will generate heavy damages and that policy actions will be costly within a finite time horizon, we understand why there is a great incentive to free ride in order to exploit the benefits of others' emissions reduction efforts. The reluctance of poor countries to join international agreements rests mainly on the historical responsibility of rich regions for atmospheric carbon concentration, whereas rich countries claim that emissions-stabilizing policies will be effective only if developing countries join them. Scholars have recently pointed out that a perceived fairness in the distribution of emissions would facilitate widespread participation in international agreements. In this paper we review the literature on the distributional aspects of emissions, focusing on contributions that investigate past trends of emissions distribution through empirical data and future trajectories through simulations obtained from integrated assessment models. We explain the methodologies used to elaborate the data and the link between real data and data coming from simulations. Results from this strand of research are interpreted in order to discuss future negotiations for post-Kyoto agreements, which will be the focus of the next Conference of the Parties in Copenhagen at the end of 2009. Particular attention is devoted to the role that technological change will play in affecting the distribution of emissions over time, and to how spillovers and experience diffusion could influence equality issues and future outcomes of policy negotiations.
Abstract:
We investigate the transition to synchronization in the Kuramoto model with bimodal distributions of the natural frequencies. Previous studies have concluded that the model exhibits a hysteretic phase transition if the bimodal distribution is close to a unimodal one, due to the shallowness of the central dip. Here we show that proximity to the unimodal-bimodal border does not necessarily imply hysteresis when the width, but not the depth, of the central dip tends to zero. We draw this conclusion from a detailed study of the Kuramoto model with a suitable family of bimodal distributions.
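A minimal numerical sketch of the setting described above: a mean-field Kuramoto model whose natural frequencies are drawn from a bimodal (two-Gaussian) distribution, integrated with a simple Euler scheme to monitor the order parameter. All parameters (N, coupling K, peak separation, width) are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bimodal natural frequencies: an equal mixture of two Gaussians at +/- omega0
N, omega0, sigma = 2000, 1.0, 0.3
omega = rng.normal(omega0 * rng.choice([-1.0, 1.0], N), sigma)

K, dt, steps = 2.0, 0.01, 10_000          # coupling strength, Euler step, iterations
theta = rng.uniform(0.0, 2.0 * np.pi, N)  # random initial phases

for _ in range(steps):
    z = np.mean(np.exp(1j * theta))       # complex order parameter r*exp(i*psi)
    # mean-field Kuramoto: dtheta_i/dt = omega_i + K*r*sin(psi - theta_i)
    theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta))

print("stationary order parameter r ~", round(float(np.abs(np.mean(np.exp(1j * theta)))), 3))
```

Sweeping K up and down and recording r would reveal whether the transition is hysteretic for a given bimodal family.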
Abstract:
This paper studies repeated games in which the times at which the stage game is repeated are not known or controlled by the players. We call this feature random monitoring. Kawamori (2004) shows that perfect random monitoring is always better than the canonical case. Surprisingly, when the monitoring is public, the result is less clear-cut and does not generalize in a straightforward way, unless the public signals are sufficiently informative about players' actions and/or players are patient enough. In addition to a discount effect, which tends to consistently favor the provision of incentives, we find an information effect associated with the impact of time uncertainty on the distribution of public signals. Whether payoff improvements are possible or not depends crucially on the direction and strength of these effects. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
Reaching and educating the masses for the benefit of all of mankind is the ultimate goal, and through the use of this technology many can be reached in their own language, in their own community, in their own time and at their own pace. Making this content available to those who will benefit from the information is vital. The people who want to consume the content are not necessarily interested in the qualification; they need the information. Making the content available in an auditory format may also help those who are less literate than others. Audio and recorded lessons have a number of uses and should not be seen merely as a medium for content distribution to distant communities. Recording lectures makes it possible for a lecturer to reach a vast number of students while presenting the lecture only once.
Abstract:
This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate study of the system load in statistical terms through accumulated calculations, since probabilistic results of the bandwidth allocation can be obtained. Nevertheless, the convolution approach has a high cost in terms of computation and storage requirements. This makes real-time calculation difficult, so many authors do not consider this approach. With the aim of reducing this cost, we propose to use the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is used in connection acceptance control, and some results are presented.
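A simplified sketch in the spirit of the convolution approach: for each class of homogeneous on/off sources the aggregate demand follows a binomial law, a distribution-based shortcut of the kind the ECA generalises with the multinomial distribution, and the per-class distributions are then convolved to obtain the instantaneous bandwidth demand and an overload probability for connection acceptance. The traffic classes, capacity and parameters are hypothetical.

```python
import numpy as np
from scipy.stats import binom

# Hypothetical traffic classes: (n_i sources, on-probability p_i, peak demand b_i units)
classes = [(40, 0.10, 2),
           (25, 0.30, 1),
           (10, 0.05, 6)]
capacity = 64          # link capacity in bandwidth units (also hypothetical)

# Per class, the number of simultaneously active sources is binomial, so the class
# demand distribution can be written down directly instead of convolving source by
# source; the class distributions are then convolved together.
dist = np.array([1.0])                    # P(total demand = 0 units) before any class
for n, p, b in classes:
    pmf_active = binom.pmf(np.arange(n + 1), n, p)
    pmf_units = np.zeros(n * b + 1)
    pmf_units[::b] = pmf_active           # k active sources -> k*b bandwidth units
    dist = np.convolve(dist, pmf_units)

overload = dist[capacity + 1:].sum()      # P(instantaneous demand > capacity)
print(f"P(demand > {capacity} units) = {overload:.3e}")
```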
Abstract:
The subdivision of space into cells resulting from a process of random nucleation and growth is a subject of interest in many scientific fields. In this paper, we deduce the expected value and variance of these cell size distributions, assuming that the space subdivision process is in accordance with the premises of the Kolmogorov-Johnson-Mehl-Avrami model. We have not imposed restrictions on the time dependence of the nucleation and growth rates. We have also developed an approximate analytical cell size probability density function. Finally, we have applied our approach to the distributions resulting from solid-phase crystallization under isochronal heating conditions.
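A minimal numerical sketch of the KJMA premise underlying this abstract: the transformed fraction X(t) = 1 - exp(-X_ext(t)) is evaluated for arbitrary time-dependent nucleation and growth rates, and checked against the classical Avrami law for constant rates. The rates, units and grid are illustrative assumptions, and this reproduces only the standard model, not the paper's cell-size results.

```python
import numpy as np

def kjma_transformed_fraction(t, nucleation_rate, growth_rate):
    """KJMA transformed fraction for arbitrary time-dependent rates:
       X_ext(t) = (4*pi/3) * int_0^t I(s) * [int_s^t G(u) du]**3 ds,
       X(t)     = 1 - exp(-X_ext(t)).
    """
    I, G = nucleation_rate(t), growth_rate(t)
    # cumulative trapezoidal integral of G, so that int_s^t G du = cumG[t] - cumG[s]
    cumG = np.concatenate(([0.0], np.cumsum(0.5 * (G[1:] + G[:-1]) * np.diff(t))))
    X = np.empty_like(t)
    for k in range(len(t)):
        radius = cumG[k] - cumG[:k + 1]          # grain radius at t[k] for nuclei born at t[:k+1]
        integrand = (4.0 * np.pi / 3.0) * I[:k + 1] * radius ** 3
        X_ext = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t[:k + 1]))
        X[k] = 1.0 - np.exp(-X_ext)
    return X

# Constant rates reproduce the classical Avrami law X = 1 - exp(-(pi/3)*I0*G0**3*t**4)
t = np.linspace(0.0, 5.0, 500)
I0, G0 = 1.0, 0.5
X_num = kjma_transformed_fraction(t, lambda t: np.full_like(t, I0), lambda t: np.full_like(t, G0))
X_ref = 1.0 - np.exp(-(np.pi / 3.0) * I0 * G0 ** 3 * t ** 4)
print("max deviation from Avrami law:", float(np.max(np.abs(X_num - X_ref))))
```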
Abstract:
The dynamics of homogeneously heated granular gases which fragment due to particle collisions is analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and the relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges in finite time, leading to a shattering singularity. If the fragmentation probability vanishes, then the number of grains grows monotonically as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and the driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the large-velocity tail is essentially exponential, independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) ~ exp(−cⁿ), with n ≈ 1.2, regardless of the fragmentation mechanism.
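A small sketch of how the reported generalized-exponential tail f(c) ~ exp(−cⁿ) could be fitted to scaled-speed data. The samples below are generated synthetically from that very law (via a gamma-power transform) and merely stand in for DSMC output, so only the fitting procedure, not the numbers, is meaningful.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Synthetic scaled-speed samples drawn exactly from f(c) ~ exp(-c**n0):
# if X ~ Gamma(1/n0, 1), then X**(1/n0) has that density. Placeholder data only.
n0 = 1.2
c_samples = rng.gamma(1.0 / n0, 1.0, size=200_000) ** (1.0 / n0)

hist, edges = np.histogram(c_samples, bins=100, density=True)
c = 0.5 * (edges[:-1] + edges[1:])

def gen_exp(c, A, B, n):
    """Generalized exponential form f(c) = A * exp(-B * c**n)."""
    return A * np.exp(-B * c ** n)

mask = hist > 0
popt, _ = curve_fit(gen_exp, c[mask], hist[mask], p0=(1.0, 1.0, 1.5), maxfev=10_000)
print(f"fitted exponent n = {popt[2]:.2f}   (samples generated with n0 = {n0})")
```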
Abstract:
Several airline consolidation events have recently been completed both in Europe and in the United States. The model we develop considers two airlines operating hub-and-spoke networks, using different hubs to connect the same spoke airports. We assume the airlines to be vertically differentiated, which allows us to distinguish between primary and secondary hubs. We conclude that this differentiation in air services becomes more accentuated after consolidation, with an increased number of flights being channeled through the primary hub. However, congestion can act as a brake on the concentration of flight frequency in the primary hub following consolidation. Our empirical application involves an analysis of Delta's network following its merger with Northwest. We find evidence consistent with an increase in the importance of Delta's primary hubs at the expense of its secondary airports. We also find some evidence suggesting that the carrier chooses to divert traffic away from those hub airports that were more prone to delays prior to the merger, in particular New York's JFK airport. Keywords: primary hub; secondary hub; airport congestion; airline consolidation; airline networks JEL Classification Numbers: D43; L13; L40; L93; R4
Abstract:
In the past, sensor networks in cities have been limited to fixed sensors, embedded in particular locations, under centralised control. Today, new applications can leverage wireless devices and use them as sensors to create aggregated information. In this paper, we show that the emerging patterns unveiled through the analysis of large sets of aggregated digital footprints can provide novel insights into how people experience the city and into some of the drivers behind these emerging patterns. We particularly explore the capacity to quantify the evolution of the attractiveness of urban space with a case study in the area of the New York City Waterfalls, a public art project of four man-made waterfalls rising from the New York Harbor. Methods to study the impact of an event of this nature are traditionally based on the collection of static information such as surveys and ticket-based people counts, which allow estimates of visitors' presence in specific areas over time to be generated. In contrast, our contribution makes use of the dynamic data that visitors generate, such as the density and distribution of aggregate phone calls and photos taken in different areas of interest and over time. Our analysis provides novel ways to quantify the impact of a public event on the distribution of visitors and on the evolution of the attractiveness of the points of interest in its proximity. This information has potential uses for local authorities and researchers, as well as service providers such as mobile network operators.
Abstract:
In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and their delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form of the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.
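A minimal, hypothetical sketch of a Rao-Blackwellized particle filter for pilot-aided OFDM channel estimation with an unknown, time-varying number of taps: each particle carries a hypothesised tap count, and a per-particle Kalman filter marginalises the tap amplitudes. This is not the paper's Random-Set-Theory formulation; the single-antenna setup, FFT size, pilot pattern, AR(1) tap dynamics and all parameters are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simplified setup: every value below is an assumption, not the paper's ----
Nfft, Lmax = 64, 8                        # FFT size, maximum channel length
pilots = np.arange(0, Nfft, 4)            # pilot subcarriers carrying unit symbols
F = np.exp(-2j * np.pi * np.outer(pilots, np.arange(Lmax)) / Nfft)  # DFT columns
a, sig_h2, sig_v2 = 0.999, 1.0 / Lmax, 0.01   # AR(1) tap dynamics, tap/noise power
Np, q_jump = 200, 0.05                    # particles, P(tap count changes per symbol)

def cn(size):
    """Circularly symmetric complex Gaussian samples with unit variance."""
    return (rng.normal(size=size) + 1j * rng.normal(size=size)) / np.sqrt(2.0)

def simulate(T, L_true=4):
    """Pilot observations from an AR(1) channel with L_true taps (ground truth)."""
    h = np.sqrt(sig_h2) * cn(L_true)
    Y = []
    for _ in range(T):
        h = a * h + np.sqrt((1 - a**2) * sig_h2) * cn(L_true)
        Y.append(F[:, :L_true] @ h + np.sqrt(sig_v2) * cn(len(pilots)))
    return Y

def rbpf(Y):
    """Each particle holds a hypothesised tap count L; the tap amplitudes are
    marginalised analytically with a per-particle Kalman filter."""
    L = rng.integers(1, Lmax + 1, size=Np)
    m = [np.zeros(l, complex) for l in L]             # Kalman means
    P = [sig_h2 * np.eye(l) for l in L]               # Kalman covariances
    w = np.full(Np, 1.0 / Np)
    for y in Y:
        logw = np.log(np.maximum(w, 1e-300))
        for i in range(Np):
            if rng.random() < q_jump:                 # birth/death move on the tap count
                new_L = int(np.clip(L[i] + rng.choice([-1, 1]), 1, Lmax))
                if new_L > L[i]:                      # append a tap drawn from the prior
                    m[i] = np.append(m[i], 0.0)
                    P[i] = np.block([[P[i], np.zeros((L[i], 1))],
                                     [np.zeros((1, L[i])), sig_h2 * np.eye(1)]])
                elif new_L < L[i]:                    # drop the last tap
                    m[i], P[i] = m[i][:new_L], P[i][:new_L, :new_L]
                L[i] = new_L
            Fi = F[:, :L[i]]
            mp = a * m[i]                                           # Kalman prediction
            Pp = a**2 * P[i] + (1 - a**2) * sig_h2 * np.eye(L[i])
            e = y - Fi @ mp                                         # innovation
            S = Fi @ Pp @ Fi.conj().T + sig_v2 * np.eye(len(pilots))
            K = Pp @ Fi.conj().T @ np.linalg.inv(S)
            m[i], P[i] = mp + K @ e, Pp - K @ Fi @ Pp               # Kalman update
            _, logdet = np.linalg.slogdet(S)
            # complex-Gaussian innovation log-likelihood (up to a constant)
            logw[i] += -np.real(e.conj() @ np.linalg.solve(S, e)) - logdet
        w = np.exp(logw - logw.max())
        w /= w.sum()
        if 1.0 / np.sum(w**2) < Np / 2:               # resample when ESS is low
            idx = rng.choice(Np, Np, p=w)
            L, m, P = L[idx], [m[j].copy() for j in idx], [P[j].copy() for j in idx]
            w = np.full(Np, 1.0 / Np)
    return float(np.sum(w * L))                       # posterior mean tap count

print("estimated number of taps:", rbpf(simulate(T=50)))
```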
Abstract:
This study examines parental time investment in their children, distinguishing between developmental and non-developmental care. Our analyses centre on three influential determinants: educational background, marital homogamy, and spouses' relative bargaining power. We find that the emphasis on quality care time is correlated with parents' education, and that marital homogamy reduces couple specialization, but only among the highly educated. In line with earlier research, we identify gendered parental behaviour. The presence of boys is an important condition for fathers' time dedication, but primarily among lower educated fathers. To the extent that parental stimulation is decisive for child outcomes, our findings suggest the persistence of important inequalities. This emerges from our particular attention to behavioural differences across the educational distribution of households.