942 results for bandwidth pricing
Abstract:
The performance of a device based on modified injection-locking techniques is studied by means of numerical simulations. The device incorporates master and slave configurations, each one with a DFB laser and an electroabsorption modulator (EAM). This arrangement allows the generation of high-peak-power, narrow optical pulses according to a periodic or pseudorandom bit stream provided by a current signal generator. The device is able to considerably increase the modulation bandwidth of free-running gain-switched semiconductor lasers using multiplexing in the time domain. Opportunities for integration in small packages or single chips are discussed.
Abstract:
A method for characterizing the microroughness of samples in optical coating technology is developed. Measurements over different spatial-frequency ranges are combined into a single power spectral density (PSD) covering a large bandwidth. This is followed by the extraction of characteristic parameters through fitting of the PSD to a suitable combination of theoretical models. The method allows us to combine microroughness measurements performed with different techniques, and the fitting procedure can be adapted to any behavior of a combined PSD. The method has been applied to a set of ion-beam-sputtered fluoride vacuum-UV coatings with an increasing number of alternating low- and high-index layers. Conclusions about roughness development and microstructural growth are drawn.
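The combine-then-fit idea can be sketched on toy data. The two overlapping frequency segments, the instruments named in the comments, and the single inverse-power-law model S(f) = A/f^n are assumptions chosen to keep the sketch short, not details taken from the abstract:

```python
import numpy as np

# Two overlapping PSD segments, as if measured by two instruments
# covering different spatial-frequency ranges (hypothetical example).
f1 = np.logspace(-2.0, 0.0, 50)   # low-spatial-frequency instrument
f2 = np.logspace(-0.5, 1.5, 50)   # high-spatial-frequency instrument
A_true, n_true = 2.0, 1.8
S1 = A_true / f1 ** n_true
S2 = A_true / f2 ** n_true

# Merge the segments into one broadband PSD.
f = np.concatenate([f1, f2])
S = np.concatenate([S1, S2])

# In log-log space the power-law model is linear: log S = log A - n log f,
# so a least-squares line fit recovers the characteristic parameters.
slope, intercept = np.polyfit(np.log(f), np.log(S), 1)
n_hat, A_hat = -slope, np.exp(intercept)
```

A real combined PSD would carry noise and possibly several superimposed model components, in which case the single line fit above would be replaced by a fit to the chosen combination of theoretical models.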
Abstract:
Preface: The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, has made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem stems from the variance process, which is not observable. There are several estimation methodologies that deal with the estimation of latent variables. One appeared particularly interesting: it proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process, namely the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure was derived only for stochastic volatility models without jumps. Thus, it became the subject of my research. This thesis consists of three parts. Each is written as an independent, self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and results obtained in the first one. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps both in the asset price and in the variance process. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function for stochastic volatility jump-diffusion models.
The empirical part of the chapter suggests that, besides stochastic volatility, jumps both in the mean and in the volatility equation are relevant for modelling returns of the S&P500 index, which was chosen as a general representative of the stock asset class. Hence, the next question is: what jump process should be used to model returns of the S&P500? The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index by a stochastic volatility jump-diffusion model. This is a surprising result. The exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any ground for comparison, there is no way to be sure that our parameter estimates and the true parameters of the models coincide. The conclusion of the second chapter provides one more reason to perform that kind of test. Thus, the third part of this thesis concentrates on the estimation of parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets.
The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of finding the true parameters, and the third chapter proves that our estimator indeed has the ability to do so. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function is working, the next question immediately arises: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used for its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure is. In practice, however, this relationship is not so straightforward, owing to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated. Thus, the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on simulated data.
It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, owing to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of parameters of stochastic volatility jump-diffusion models.
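The flavour of characteristic-function matching can be sketched on a toy model. The thesis works with the joint unconditional CF of affine stochastic volatility jump-diffusion models; here a plain Gaussian, whose CF is also available in closed form, stands in so the sketch stays short. All sample sizes, grids and parameter values below are illustrative assumptions:

```python
import numpy as np

def empirical_cf(x, u):
    """Empirical characteristic function of sample x on the grid u."""
    return np.exp(1j * np.outer(u, x)).mean(axis=1)

def model_cf(u, mu, sigma):
    """Closed-form CF of N(mu, sigma^2), the toy stand-in for the
    joint unconditional CF of an SV jump-diffusion model."""
    return np.exp(1j * u * mu - 0.5 * (sigma * u) ** 2)

# Simulated "log-returns" from known parameters.
rng = np.random.default_rng(0)
returns = rng.normal(0.05, 0.2, size=20_000)
u = np.linspace(-5.0, 5.0, 41)
ecf = empirical_cf(returns, u)

# Match the model CF to the empirical CF by integrated squared
# distance over the grid; a plain grid search keeps this deterministic.
mus = np.linspace(-0.10, 0.20, 31)
sigmas = np.linspace(0.05, 0.35, 31)
obj = np.array([[np.sum(np.abs(ecf - model_cf(u, m, s)) ** 2)
                 for s in sigmas] for m in mus])
i, j = np.unravel_index(obj.argmin(), obj.shape)
mu_hat, sigma_hat = mus[i], sigmas[j]
```

The same matching principle applies with the bi- or three-dimensional joint CF, at the cost of the higher-dimensional frequency grid that drives the computational burden discussed above.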
Abstract:
A new conceptualization of investment is defended in accordance with the nature of uncertainty. This conceptualization yields a theory of investment compatible with the concept of intrinsic value that can be expressed as a mathematical model. This model is an alternative to the so-called 'pricing models'.
Abstract:
In this article, starting from the inverse of the variance-covariance matrix, the Markowitz mean-variance model is obtained by a shorter and mathematically rigorous route. Sharpe's CAPM equilibrium equation is also derived.
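The role of the inverse covariance matrix can be illustrated with the global minimum-variance portfolio, whose weights follow directly from it as w = Σ⁻¹1 / (1ᵀΣ⁻¹1). The 3-asset covariance matrix below is invented for illustration and is not taken from the article:

```python
import numpy as np

# Hypothetical covariance matrix of three asset returns.
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])

# Minimum-variance weights: w = Sigma^{-1} 1 / (1' Sigma^{-1} 1).
ones = np.ones(len(Sigma))
Sigma_inv = np.linalg.inv(Sigma)
w = Sigma_inv @ ones / (ones @ Sigma_inv @ ones)

# Portfolio variance equals 1 / (1' Sigma^{-1} 1), and can never
# exceed the variance of holding any single asset outright.
port_var = w @ Sigma @ w
```

The full mean-variance frontier and the CAPM equilibrium follow the same pattern, with expected returns entering alongside the vector of ones.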
Abstract:
This paper analyses the behaviour of pharmaceutical companies that face the threat of having their drugs excluded from reimbursement in markets also characterised by price caps. We conclude that price elasticity of demand and cost differentials cause the price discounts which drug firms offer to health care organisations. Additionally, we conclude that price cap regulations affect the time path of prices, resulting in higher prices for new products and lower prices for old products.
Abstract:
This work carries out an empirical evaluation of the impact of the main mechanism for regulating the prices of medicines in the UK on a variety of pharmaceutical price indices. The empirical evidence shows that the overall impact of the rate-of-return cap appears to have been slight or even null, and in any case that the impact would differ across therapeutic areas. These empirical findings suggest that the price regulation has managed to encourage UK-based firms' diversification across many therapeutic areas.
Abstract:
We have explored the possibility of obtaining first-order permeability estimates for saturated alluvial sediments based on the poro-elastic interpretation of the P-wave velocity dispersion inferred from sonic logs. Modern sonic logging tools designed for environmental and engineering applications allow P-wave velocity measurements at multiple emitter frequencies over a bandwidth covering 5 to 10 octaves. Methodological considerations indicate that, for saturated unconsolidated sediments in the silt to sand range and typical emitter frequencies ranging from approximately 1 to 30 kHz, the observable velocity dispersion should be sufficiently pronounced to allow reliable first-order estimation of the permeability structure. The corresponding predictions have been tested on and verified for a borehole penetrating a typical surficial alluvial aquifer. In addition to multifrequency sonic logs, a comprehensive suite of nuclear and electrical logs, an S-wave log, a litholog, and a limited number of laboratory measurements of the permeability from retrieved core material were also available. This complementary information was found to be essential for parameterizing the poro-elastic inversion procedure and for assessing the uncertainty and internal consistency of the corresponding permeability estimates. Our results indicate that the permeability estimates thus obtained are largely consistent with those expected from the corresponding granulometric characteristics, as well as with the available evidence from laboratory measurements. These findings are also consistent with evidence from ocean acoustics, which indicates that, over a frequency range of several orders of magnitude, the classical theory of poro-elasticity is generally capable of explaining the observed P-wave velocity dispersion in medium- to fine-grained seabed sediments.
Abstract:
The report includes a recap of the initiatives relating to the Network and highlights usage, including the increased need for bandwidth and access to high-speed Internet by ICN users.
Abstract:
Generic or own-brand products were initially only less expensive copies of the branded-label alternative, but nowadays pricing alone is not enough to survive in the Fast Moving Consumer Goods (FMCG) or Consumer Packaged Goods (CPG) markets. With this in mind, manufacturers of generic brands have adapted to this rapidly growing niche by investing in design and marketing during the initial phase in order to be perceived as having a quality product comparable to that of the branded products. In addition, they have gone further with a second phase and resorted to innovative product differentiation strategies and even pure innovation in many cases. These strategies have granted generic brands constantly increasing market shares and a position of equals relative to national brands. Using previous analyses and case studies, this paper provides conceptual and empirical evidence to explain the surprisingly fast growth and penetration of generic supermarket brands, which in their relatively short lifespan have grown to rival the historical market leaders, the branded products. According to this analysis, the main conclusion is that the growth in generic brands can be explained not only by price competition, but also by the use of innovative product differentiation strategies.
Abstract:
This issue review highlights the federal mandate requiring all non-federal public safety license holders on frequencies ranging from 72 to 512 megahertz to reduce their operating bandwidth from 25 kilohertz to 12.5 kilohertz narrowband channels by January 1, 2013.
Abstract:
The federal government mandated that all non-federal public safety license holders on frequencies ranging from 150 to 512 megahertz reduce their operating bandwidth from 25 kilohertz to 12.5 kilohertz narrowband channels and update their operating licenses by January 1, 2013. Failure to do so will result in the loss of communication capabilities and fines. This issue review analyzes the impact on state agencies of the federal mandate requiring all two-way radio systems and some paging networks, including those used by public-safety agencies, to meet the new narrowband requirements by January 1, 2013. This issue review does not address the impact on local communications systems.
Abstract:
PURPOSE: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map IRC in Switzerland by taking into account all of the following: architectural factors, spatial relationships between the measurements, and geological information. METHODS: We looked at about 240,000 IRC measurements carried out in about 150,000 houses. As predictor variables we included building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature and lithology in the kernel estimation models. We developed predictive maps as well as a map of the local probability of exceeding 300 Bq/m(3). Additionally, we developed a map of a confidence index in order to estimate the reliability of the probability map. RESULTS: Our models were able to explain 28% of the variation in the IRC data. All variables added information to the model. The model estimation revealed a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimation. Furthermore, we assessed the mapping characteristics of kernel estimation overall as well as by municipality. Overall, our model reproduces spatial IRC patterns which were already obtained earlier. At the municipal level, we could show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps. Maps corresponding to detached houses with concrete foundations indicate systematically smaller IRC than maps corresponding to farms with earth foundations. CONCLUSIONS: IRC mapping based on kernel estimation is a powerful tool to predict and analyze IRC on a large scale as well as at a local level. This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions while accounting for geological information and spatial relations between IRC measurements.
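The per-variable bandwidths mentioned in the results are the general mechanism of multivariate product-kernel regression, which can be sketched minimally with a Nadaraya-Watson estimator. The data, bandwidths and target point below are illustrative assumptions, not the Swiss IRC data set:

```python
import numpy as np

def nadaraya_watson(X, y, x0, bandwidths):
    """Nadaraya-Watson prediction at x0 with one Gaussian bandwidth
    per predictor (product kernel): a small bandwidth means the
    variable strongly localizes the estimate, a large one means it
    barely matters."""
    z = (X - x0) / bandwidths                 # scale each variable
    w = np.exp(-0.5 * np.sum(z ** 2, axis=1))  # product of 1-D kernels
    return np.sum(w * y) / np.sum(w)

# Synthetic data: the response depends only on the first predictor.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(2_000, 2))
y = X[:, 0] ** 2 + 0.1 * rng.normal(size=2_000)

# Tight bandwidth on the relevant variable, loose on the irrelevant one;
# the true regression surface at (0.5, 0.0) is 0.25.
pred = nadaraya_watson(X, y, np.array([0.5, 0.0]),
                       np.array([0.1, 0.5]))
```

Evaluating the estimator on a grid of x0 values yields a predictive map; replacing y with the indicator y > threshold yields a local exceedance-probability map of the kind described above.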
Abstract:
This is a study of how transportation policy can be fashioned to improve Iowa's long-term economic prospects. The research focuses on the state level and covers pricing, resource allocation, investment, and other issues that directly affect the performance of public facilities that support transportation of goods and people to and from points in Iowa. Chapter 1 is an introduction. Chapter 2 begins with an assessment of how Iowa's economy is changing, both functionally and spatially. Commuting patterns and methods of goods movement are then discussed. The purpose of this analysis is to provide a context for the exploration of transportation policy issues in subsequent chapters. In Chapter 3 a framework is established for evaluating changes in transportation policies. A working definition of economic development is given and the role of government policies in making an area more attractive to economic activity is considered. Chapter 4 analyzes public policy options for Iowa's roads and highways. These policy options are intended to help the state compete for economic activity. Chapter 5 assesses alternative investment strategies for major navigational facilities on the upper Mississippi River. Chapter 6 examines major transportation policy issues in Iowa's agricultural sector. The current magnitude of agricultural shipments and the roles of several modes are presented. After focusing on issues related to railroad competitiveness, the analysis turns to how Iowa's rural roads should be financed. The need for joint investment and pricing decisions affecting waterways, railroads, and rural roads is stressed. Chapter 7 examines the current status of freight transportation in Iowa. An assessment is made of issues related to trucking and of intermodal transportation and its potential for cost-effective shipping to and from businesses in Iowa. Chapter 8 summarizes the key findings of this study, offering ten recommendations. 
These recommendations relate to transportation as a means of facilitating economic development.