942 results for bandwidth pricing
Abstract:
This paper proposes a multicast implementation based on adaptive routing with anticipated calculation. Three different cost measures can be considered for a point-to-multipoint connection: bandwidth cost, connection establishment cost, and switching cost. The application of the method, based on pre-evaluated routing tables, makes it possible to reduce the bandwidth cost and the connection establishment cost individually.
Abstract:
The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the computational and storage cost, they propose the use of the multinomial distribution function. This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements, which in turn makes a simple deconvolution process possible. Moreover, under certain conditions additional improvements may be achieved.
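As a rough, hedged illustration of the idea behind this approach (not the authors' exact formulation), consider a single class of N homogeneous on-off sources, each active with probability p and consuming b bandwidth units when active. The instantaneous aggregate demand then follows a binomial distribution, the two-outcome special case of the multinomial model, and the overflow probability on a link of capacity C can be computed directly:

```python
from math import comb

def overflow_probability(n_sources, p_active, b_per_source, capacity):
    """P(instantaneous aggregate demand > capacity) for n_sources homogeneous
    on-off sources, each active with probability p_active and consuming
    b_per_source bandwidth units when active (binomial special case of the
    multinomial approach; parameter names and values are illustrative)."""
    prob = 0.0
    for k in range(n_sources + 1):
        if k * b_per_source > capacity:
            prob += comb(n_sources, k) * p_active**k * (1 - p_active)**(n_sources - k)
    return prob

# Example: 100 sources at 10% activity, 2 Mb/s each, on a 64 Mb/s link
print(overflow_probability(100, 0.10, 2.0, 64.0))
```

With several heterogeneous traffic classes, the analogous computation runs over the multinomial distribution of the per-class numbers of active connections, which, as the abstract notes, keeps the deconvolution process simple.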
Abstract:
This paper discusses the setting of rates of return for concessions in Brazil, with specific application to the methodology of the Agência Nacional de Transportes Terrestres (ANTT). It shows the inadequacy of the current regulation, which is based on the concept of internal rate of return (IRR) rather than on the opportunity cost of capital. Using an example with data from the height of the international financial crisis (December 2008), it also highlights the lack of logic in using past returns and prices to estimate rates of return, a procedure common to the entire area of public service concessions in Brazil. An alternative methodology is proposed whose results are sensitive to current capital market conditions and consistent with the situation prevailing at the time.
Abstract:
Economic evaluation of health care interventions has experienced strong growth over the past decade and is increasingly used as a support tool in decision-making on public funding of health services and pricing in European countries. A necessary condition for using these evaluations is that the agents who perform them follow minimum rules reflecting agreement on methodological aspects. Although there are methodological issues on which there is a high degree of consensus, there are others on which no such agreement exists, either because they are closer to the normative field or because they have experienced significant methodological advances in recent years. In this first article of a series of three, we discuss the perspective of analysis and the assessment of costs in the economic evaluation of health interventions, using the Metaplan technique. Finally, research lines are proposed to overcome the identified discrepancies.
Abstract:
Next Generation Access Networks (NGAN) are the new step forward to deliver broadband services and to facilitate the integration of different technologies. It is plausible to assume that, from a technological standpoint, the Future Internet will be composed of long-range high-speed optical networks; a number of wireless networks at the edge; and, in between, several access technologies, among which Passive Optical Networks (xPON) are very likely to succeed, due to their simplicity, low cost, and increased bandwidth. Among the different PON technologies, the Ethernet PON (EPON) is the most promising alternative to satisfy operator and user needs, due to its low cost, flexibility and interoperability with other technologies. One of the most interesting challenges in such technologies relates to the scheduling and allocation of resources in the upstream (shared) channel. The aim of this research project is to study and evaluate current contributions and to propose new, efficient solutions to the resource allocation issues in Next Generation EPON (NG-EPON). Key issues in this context are future end-user needs, integrated quality of service (QoS) support and optimized service provisioning for real-time and elastic flows. This project will unveil research opportunities, issue recommendations and propose novel mechanisms associated with convergence within heterogeneous access networks, and will thus serve as a basis for long-term research projects in this direction. The project has served as a platform for the generation of new concepts and solutions that were published in national and international conferences, scientific journals and a book chapter. We expect additional research publications, beyond those mentioned, to follow in the coming months.
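As a hedged sketch of the upstream scheduling problem mentioned above (not the mechanism proposed in this project), the snippet below implements a limited-service grant allocation in the spirit of IPACT: each ONU reports its queue occupancy and the OLT grants at most a fixed per-cycle maximum. The function name, the cap and the example values are illustrative.

```python
def limited_service_grants(requests_bytes, max_grant_bytes=15000):
    """IPACT-style limited-service DBA sketch: grant each ONU the smaller of
    its reported upstream queue occupancy and a per-cycle cap."""
    return [min(request, max_grant_bytes) for request in requests_bytes]

# Four ONUs report buffered upstream traffic (bytes); the OLT computes grants.
print(limited_service_grants([1500, 40000, 0, 9000]))  # -> [1500, 15000, 0, 9000]
```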
Abstract:
With its development, the economic evaluation of health care interventions has become a support tool for decisions on pricing and reimbursement of new health interventions. The increasingly extensive application of these techniques has led to the identification of particular situations in which, for various reasons, it may be reasonable to take special considerations into account when applying the general principles of economic evaluation. In this article, which closes a series of three, we discuss, using the Metaplan technique, the economic evaluation of health interventions in special situations such as rare diseases and end-of-life treatments, as well as the consideration of externalities in assessments, finally pointing out some research areas to address the main problems identified in these fields.
Abstract:
This work analyzes whether the relationship between risk and return predicted by the Capital Asset Pricing Model (CAPM) holds in the Brazilian stock market. The analysis is based on discrete wavelet decomposition over different time scales. This technique makes it possible to examine the relationship across time horizons, from the short term (2 to 4 days) up to the long term (64 to 128 days). The results indicate a negative or null relationship between systematic risk and returns in Brazil from 2004 to 2007. Since the average excess return of the market portfolio over a risk-free asset during that period was positive, this relationship would be expected to be positive; that is, higher systematic risk should result in higher excess returns, which did not occur. Therefore, during that period, appropriate compensation for systematic risk was not observed in the Brazilian market. The scales that proved most significant for the risk-return relation were the first three, corresponding to short-term horizons. When the analysis is performed year by year, thereby separating positive and negative premiums, the risk-return relation predicted by the CAPM shows some relevance in certain years. However, this pattern did not persist over the years. Therefore, there is no evidence strong enough to confirm that asset pricing follows the model.
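A minimal sketch of the kind of scale-by-scale estimation described above (not the paper's exact procedure), assuming the PyWavelets package and simulated daily excess returns: both series are decomposed with the discrete wavelet transform and a CAPM beta is estimated from the detail coefficients at each scale.

```python
import numpy as np
import pywt  # PyWavelets

def scale_betas(asset_excess, market_excess, wavelet="db4", level=5):
    """Estimate one CAPM beta per wavelet scale from DWT detail coefficients.
    Scale 1 is the finest (shortest horizon); wavelet and level are illustrative."""
    asset_coeffs = pywt.wavedec(asset_excess, wavelet, level=level)
    market_coeffs = pywt.wavedec(market_excess, wavelet, level=level)
    betas = {}
    # wavedec returns [approximation, detail_level, ..., detail_1]
    for idx, (da, dm) in enumerate(zip(asset_coeffs[1:], market_coeffs[1:])):
        scale = level - idx
        cov = np.cov(da, dm)
        betas[f"scale_{scale}"] = cov[0, 1] / cov[1, 1]
    return betas

# Illustration with simulated excess returns (beta of 0.8 by construction)
rng = np.random.default_rng(0)
market = rng.normal(0.0, 0.01, 1024)
asset = 0.8 * market + rng.normal(0.0, 0.01, 1024)
print(scale_betas(asset, market))
```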
Abstract:
This article re-examines momentum strategies to verify whether the lack of evidence of their profitability in the Brazilian market may be related to the crashes they experience during crises, as reported by Daniel and Moskowitz. To that end, a Student's t-test was used to compare the average returns earned by the momentum portfolio inside and outside financial crises between January 1997 and March 2014. The results show that, in line with what has been reported for other markets, the portfolio experiences crashes during crises, while providing positive and significant returns in the remaining periods, even after controlling for the risk factors of the Capital Asset Pricing Model (CAPM) and the Fama-French model. These findings indicate that the lack of evidence on the profitability of these strategies does not imply that the Brazilian market is an exception; rather, it can be explained by the momentum portfolio crashes during crises, which wipe out much of the positive returns earned by the strategy in other periods.
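A minimal sketch of the comparison described above, assuming SciPy and two illustrative arrays of monthly momentum returns split into crisis and non-crisis periods (not the article's data); Welch's variant of the t-test is used here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Illustrative monthly momentum returns: crashes in crises, gains otherwise.
crisis_returns = rng.normal(-0.04, 0.08, 30)
normal_returns = rng.normal(0.012, 0.04, 170)

# Welch's t-test for a difference in mean returns between the two regimes.
t_stat, p_value = stats.ttest_ind(crisis_returns, normal_returns, equal_var=False)
print(f"mean (crisis) = {crisis_returns.mean():.4f}, "
      f"mean (normal) = {normal_returns.mean():.4f}, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```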
Abstract:
Computed tomography (CT) is an imaging technique whose use has grown steadily since it first appeared in the early 1970s. In the clinical environment, this imaging system has become a gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even though CT brings a direct benefit to patient care, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, image quality and dose must be balanced so as to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases that require several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults. For this population, the risk of developing a cancer whose latency period exceeds 20 years is significantly higher than for adults. Assuming that each examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. CT technology has been advancing rapidly, and since 2009 new iterative image reconstruction techniques, known as statistical iterative reconstructions, have been introduced to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce the dose delivered during CT examinations of children and young adults as much as possible while maintaining an image quality sufficient for diagnosis, in order to propose optimized protocols.

The optimization step requires evaluating both the delivered dose and the image quality useful for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first, "physical" approach computes physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-defined conditions. Although this approach is limited because it does not take the radiologist's perception into account, it characterizes certain image properties in a simple and rapid way. The second, "clinical" approach is based on the evaluation of anatomical structures (diagnostic criteria) present in patient images. Radiologists involved in the assessment step were asked to score the diagnostic quality of these structures using a simple rating scale. This approach is relatively complicated to implement and time-consuming; nevertheless, it has the advantage of being very close to the radiologist's practice and can be considered the reference method.

Among the main results of this work, the statistical iterative reconstruction algorithms studied in the clinic (ASIR and VEO) were shown to have a strong potential to reduce CT dose (by up to 90%). However, by their very mechanism they modify the appearance of the image, causing a change in texture that may affect diagnostic quality. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise power spectrum (NPS), whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also shows that integrating these new reconstruction techniques into clinical practice cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the image quality tools developed can also guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
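A minimal sketch of the noise power spectrum (NPS) measurement referred to above, assuming a stack of same-size ROIs extracted from scans of a uniform phantom; the array shapes, pixel size and synthetic data are illustrative:

```python
import numpy as np

def nps_2d(rois, pixel_size_mm):
    """2D noise power spectrum from an (n_roi, ny, nx) stack of uniform-phantom
    ROIs: subtract each ROI's mean, take the 2D DFT, and average the scaled
    squared magnitudes (standard definition; inputs here are synthetic)."""
    n_roi, ny, nx = rois.shape
    spectra = []
    for roi in rois:
        detrended = roi - roi.mean()
        dft = np.fft.fftshift(np.fft.fft2(detrended))
        spectra.append((pixel_size_mm ** 2 / (nx * ny)) * np.abs(dft) ** 2)
    return np.mean(spectra, axis=0)

# Synthetic example: 32 ROIs of 64x64 pixels of white noise, 0.5 mm pixels.
rng = np.random.default_rng(42)
rois = rng.normal(0.0, 10.0, size=(32, 64, 64))
nps = nps_2d(rois, pixel_size_mm=0.5)
print(nps.shape, float(nps.mean()))  # for white noise the NPS is roughly flat
```

Comparing such spectra between conventional and iterative reconstructions is one way to quantify the texture change discussed in the abstract.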
Abstract:
It is well documented that reducing blood pressure (BP) in hypertensive individuals reduces the risk of cardiovascular (CV) events. Despite this, many patients with hypertension remain untreated or inadequately treated, and fail to reach the recommended BP goals. Suboptimal BP control, whilst arising from multiple causes, is often due to poor patient compliance and/or persistence, and results in a significant health and economic burden on society. The use of fixed-dose combinations (FDCs) for the treatment of hypertension has the potential to increase patient compliance and persistence. When compared with antihypertensive monotherapies, FDCs may also offer equivalent or better efficacy, and the same or improved tolerability. As a result, FDCs have the potential to reduce both the CV event rates and the non-drug healthcare costs associated with hypertension. When FDCs are adopted for the treatment of hypertension, issues relating to copayment, formulary restrictions and therapeutic reference pricing must be addressed.
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed applications that run on top of P2P networks. Typical P2P applications are video streaming, file sharing, etc. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while the user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that provides an approximate view of the system or part of it. This approximate view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays that maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages that reflect the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take into account the memory available at processes by limiting the view they have to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, which are based on a propagation overlay that tends towards the global tree overlay and adapts to some constraints of the underlying system. At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes communication cost.
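A hedged sketch of one ingredient described above, the selection of maximally reliable paths: given per-link success probabilities, a most-reliable path can be found by running Dijkstra's algorithm on edge weights -log(p). The overlay graph, probabilities and node names below are illustrative, not taken from the thesis.

```python
import heapq
from math import exp, log

def most_reliable_path(links, source, target):
    """Most-reliable path via Dijkstra on -log(p) edge weights, where `links`
    maps undirected edges (u, v) to their success probability p.
    Returns the path and its end-to-end reliability."""
    graph = {}
    for (u, v), p in links.items():
        graph.setdefault(u, []).append((v, -log(p)))
        graph.setdefault(v, []).append((u, -log(p)))
    best, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == target:
            break
        if cost > best.get(node, float("inf")):
            continue
        for neighbor, weight in graph.get(node, []):
            new_cost = cost + weight
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                prev[neighbor] = node
                heapq.heappush(heap, (new_cost, neighbor))
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    path.reverse()
    return path, exp(-best[target])

# Illustrative overlay: peers A-D with per-link delivery probabilities.
links = {("A", "B"): 0.9, ("B", "D"): 0.9, ("A", "C"): 0.99, ("C", "D"): 0.95}
print(most_reliable_path(links, "A", "D"))  # -> (['A', 'C', 'D'], ~0.94)
```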
Abstract:
Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p.37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, or non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful to gain insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement to the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly by containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that allows us to learn about the relative importance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative about reservation wage movements over the unemployment spell. The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies. The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency on top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches - such as the provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
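A minimal sketch of the two-group, two-period difference-in-differences comparison used in the first chapter, with purely illustrative group means rather than the thesis's data:

```python
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Canonical two-group, two-period difference-in-differences estimate:
    the change in the treated group net of the change in the control group."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Illustrative mean outcomes (e.g., post-unemployment monthly earnings)
# before and after a reform that shortened benefit durations.
effect = diff_in_diff(treated_pre=1480.0, treated_post=1560.0,
                      control_pre=1500.0, control_post=1530.0)
print(effect)  # -> 50.0
```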
Abstract:
The growing use of mobile devices and the great advances in wireless applications and systems have driven the demand for miniaturized band-pass filters that operate at high frequencies and offer high performance. Filters based on Bulk Acoustic Wave (BAW) resonators are becoming the best alternative to Surface Acoustic Wave (SAW) filters, since they operate at higher frequencies, can handle higher power levels and are compatible with CMOS technology. The ladder filter, built from BAW resonators, is currently the best option because of its ease of design and low manufacturing cost, although the coupled resonator filter (CRF) offers better performance, such as wider bandwidth, smaller size and mode conversion. The problem with this type of filter lies in its design complexity and high cost. This work carries out the design of a CRF from rather strict specifications, demonstrating its high performance despite its main drawback: the manufacturing cost.
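As hedged background on the resonators mentioned above (not the thesis's CRF design), the snippet below evaluates the input impedance of a lossless Butterworth-Van Dyke model of a BAW resonator, i.e. the static capacitance C0 in parallel with the motional branch Lm-Cm; the component values are illustrative.

```python
import numpy as np

def bvd_impedance(freq_hz, c0, lm, cm):
    """Input impedance of a lossless Butterworth-Van Dyke resonator model:
    static capacitance c0 in parallel with the series motional branch lm-cm.
    Series resonance at 1/(2*pi*sqrt(lm*cm)); anti-resonance slightly above."""
    w = 2.0 * np.pi * freq_hz
    z_static = 1.0 / (1j * w * c0)
    z_motional = 1j * w * lm + 1.0 / (1j * w * cm)
    return z_static * z_motional / (z_static + z_motional)

# Illustrative values giving a series resonance near 2 GHz.
c0, lm, cm = 1e-12, 79.2e-9, 80e-15
freqs = np.linspace(1.8e9, 2.2e9, 5)
for f, z in zip(freqs, bvd_impedance(freqs, c0, lm, cm)):
    print(f"{f / 1e9:.3f} GHz  |Z| = {abs(z):.1f} ohm")
```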
Abstract:
This paper applies random matrix theory to obtain analytical characterizations of the capacity of correlated multiantenna channels. The analysis is not restricted to the popular separable correlation model, but rather it embraces a more general representation that subsumes most of the channel models that have been treated in the literature. For arbitrary signal-to-noise ratios (SNR), the characterization is conducted in the regime of large numbers of antennas. For the low- and high-SNR regions, in turn, we uncover compact capacity expansions that are valid for arbitrary numbers of antennas and that shed insight into how antenna correlation impacts the tradeoffs between power, bandwidth and rate.
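A hedged numerical sketch of the quantity being characterized, using the familiar separable (Kronecker) correlation model that the paper generalizes; the antenna counts, exponential correlation matrices and SNR below are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm

def ergodic_capacity(n_r, n_t, snr, r_rx, r_tx, n_trials=2000, seed=0):
    """Monte Carlo ergodic capacity (bits/s/Hz) of a Kronecker-correlated
    Rayleigh MIMO channel: C = E[log2 det(I + (snr/n_t) * H H^H)] with
    H = R_rx^{1/2} H_w R_tx^{1/2} and i.i.d. complex Gaussian H_w."""
    rng = np.random.default_rng(seed)
    r_rx_half, r_tx_half = sqrtm(r_rx), sqrtm(r_tx)
    total = 0.0
    for _ in range(n_trials):
        h_w = (rng.standard_normal((n_r, n_t))
               + 1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2.0)
        h = r_rx_half @ h_w @ r_tx_half
        gram = np.eye(n_r) + (snr / n_t) * h @ h.conj().T
        total += np.linalg.slogdet(gram)[1] / np.log(2.0)
    return total / n_trials

def exp_corr(n, rho):
    """Exponential correlation matrix with entries rho^|i-j| (illustrative)."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

# 4x4 channel at 10 dB SNR (linear 10), moderate correlation at both ends.
print(ergodic_capacity(4, 4, snr=10.0, r_rx=exp_corr(4, 0.5), r_tx=exp_corr(4, 0.5)))
```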
Abstract:
This study outlines several possible structures for livestock revenue insurance. The policies take the form of an exotic option—an Asian basket option. The actuarially fair premiums for these policies are equal to the prices of the options they represent. Due to the complexity of pricing Asian basket options, we have combined two techniques for pricing options to reach the actuarially fair premiums. Projected premiums, producer welfare, and program efficiency are evaluated for the insurance products and existing market tools. Using efficiency ratios and certainty equivalent returns, we compare the insurance policies to strategies involving existing futures and options.
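A hedged Monte Carlo sketch of the pricing problem described above: the fair premium of an Asian basket put (a stylized revenue floor) under correlated geometric Brownian motion. The spot prices, weights, volatilities, correlation and strike are illustrative, and this is not the study's combined pricing technique.

```python
import numpy as np

def asian_basket_put_mc(spot, weights, vols, corr, rate, strike,
                        maturity=1.0, n_steps=52, n_paths=20_000, seed=0):
    """Monte Carlo price of an Asian basket put: payoff is
    max(strike - time-average of the weighted basket, 0), discounted at `rate`.
    Assets follow correlated GBM; all parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    spot, weights, vols = map(np.asarray, (spot, weights, vols))
    dt = maturity / n_steps
    chol = np.linalg.cholesky(np.asarray(corr))
    payoffs = np.empty(n_paths)
    for i in range(n_paths):
        prices = np.tile(spot, (n_steps + 1, 1)).astype(float)
        for t in range(1, n_steps + 1):
            shocks = chol @ rng.standard_normal(len(spot))
            prices[t] = prices[t - 1] * np.exp(
                (rate - 0.5 * vols**2) * dt + vols * np.sqrt(dt) * shocks)
        basket_path = prices[1:] @ weights          # basket value at each step
        payoffs[i] = max(strike - basket_path.mean(), 0.0)
    return np.exp(-rate * maturity) * payoffs.mean()

# Two-commodity revenue basket with weekly averaging over one year.
premium = asian_basket_put_mc(
    spot=[100.0, 95.0], weights=[0.6, 0.4], vols=[0.20, 0.25],
    corr=[[1.0, 0.3], [0.3, 1.0]], rate=0.03, strike=95.0)
print(round(premium, 2))
```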