994 results for Utility optimization


Relevance:

70.00%

Publisher:

Abstract:

We address a portfolio optimization problem in a semi-Markov modulated market. We study both terminal expected utility optimization on a finite time horizon and risk-sensitive portfolio optimization on finite and infinite time horizons. We obtain optimal portfolios in relevant cases. A numerical procedure is also developed to compute the optimal expected terminal utility for the finite-horizon problem.
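As a concrete illustration of such a numerical procedure, the sketch below runs a backward dynamic-programming recursion for the expected terminal CRRA utility in a discrete-time market whose drift and volatility switch with a regime chain. For simplicity, a plain two-state Markov chain stands in for the semi-Markov modulation, and all parameter values are assumptions, not the paper's.

```python
import numpy as np

# Backward recursion for max E[U(W_T)] with U(w) = w**gam / gam in a
# discrete-time market whose drift/volatility follow a two-state regime
# chain.  For CRRA utility the value factorizes as
#   V(t, i, w) = (w**gam / gam) * h[t, i],
# so it suffices to recurse on the regime-dependent factor h.
gam, T, dt = 0.5, 10, 0.1             # risk aversion, steps, step size
mu = np.array([0.08, 0.02])           # drift per regime (assumed)
sig = np.array([0.15, 0.30])          # volatility per regime (assumed)
P = np.array([[0.9, 0.1],             # regime transition matrix (assumed)
              [0.2, 0.8]])
pis = np.linspace(0.0, 1.0, 101)      # candidate risky fractions

h = np.ones(2)                        # terminal condition h[T, i] = 1
for t in range(T):
    new_h = np.empty(2)
    for i in range(2):
        # binomial approximation of the one-step excess return per regime
        up = mu[i] * dt + sig[i] * np.sqrt(dt)
        dn = mu[i] * dt - sig[i] * np.sqrt(dt)
        growth = 0.5 * ((1 + pis * up) ** gam + (1 + pis * dn) ** gam)
        # optimize over pi, then take the expectation over the next regime
        new_h[i] = growth.max() * (P[i] @ h)
    h = new_h
print("optimal expected-utility factor by starting regime:", h)
```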

Relevance:

70.00%

Publisher:

Abstract:

We consider a joint power control and transmission scheduling problem in wireless networks with average power constraints. While the capacity region of a wireless network is convex, characterizing this region is a hard problem. We formulate a network utility optimization problem involving time-sharing across different "transmission modes," where each mode corresponds to the set of power levels used in the network. The optimal solution has the structure of time-sharing across a small set of such modes. We use this structure to develop an efficient heuristic that finds a suboptimal solution through column generation iterations. The heuristic converges quickly in simulations and provides a tool for wireless network planning.
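A minimal sketch of such a column-generation loop, under an assumed toy interference model: the master LP chooses time-sharing fractions over the current set of modes subject to average-power budgets, and the pricing step enumerates candidate power vectors and adds the one with the best reduced cost. The power levels, rate model, and constants are all illustrative, not the paper's.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Toy instance: 3 interfering links; a "mode" is a vector of per-link
# powers, and the utility of a mode is its sum rate.
levels = [0.0, 0.5, 1.0]              # allowed power levels (assumption)
n_links, noise = 3, 0.1
p_avg = np.array([0.4, 0.4, 0.4])     # per-link average-power budgets

def rate(p):
    p = np.asarray(p, dtype=float)
    interference = p.sum() - p        # everyone else's power
    return np.log2(1 + p / (noise + interference)).sum()

modes = [np.zeros(n_links)]           # start from the all-off mode
for _ in range(20):
    R = np.array([rate(m) for m in modes])
    P = np.array(modes).T             # row n: power of link n in each mode
    # master LP: max sum_m theta_m R_m  s.t.  P theta <= p_avg, sum theta = 1
    res = linprog(-R, A_ub=P, b_ub=p_avg,
                  A_eq=np.ones((1, len(modes))), b_eq=[1.0],
                  bounds=[(0, None)] * len(modes), method="highs")
    lam, mu = res.ineqlin.marginals, res.eqlin.marginals[0]
    # pricing: a new mode improves the master iff its reduced cost is > 0
    best, best_rc = None, 1e-6
    for cand in itertools.product(levels, repeat=n_links):
        rc = rate(cand) + lam @ np.asarray(cand) + mu
        if rc > best_rc:
            best, best_rc = np.asarray(cand), rc
    if best is None:                  # no improving mode: time-sharing optimal
        break
    modes.append(best)
print("utility:", -res.fun, "active modes:", len(modes))
```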

Relevance:

60.00%

Publisher:

Abstract:

This thesis considers a set of methods allowing statistical learning algorithms to better handle the sequential nature of financial portfolio management problems. We begin by considering the general problem of composing learning algorithms that must handle sequential tasks, in particular that of efficiently updating training sets in a sequential validation framework. We enumerate the desiderata that composition primitives should satisfy and highlight the difficulty of meeting them rigorously and efficiently. We go on to present a set of algorithms that achieve these goals, and present a case study of a complex financial decision-making system that uses these techniques. We then describe a general method for transforming a non-Markovian sequential decision problem into a supervised learning problem by employing a search algorithm based on the K best paths. We address a portfolio management application in which we train a learning algorithm to directly optimize a Sharpe ratio (or another non-additive criterion incorporating risk aversion). We illustrate the approach with a thorough experimental study, proposing a neural network architecture specialized for portfolio management and comparing it to several alternatives. Finally, we introduce a functional representation of time series that allows forecasts to be made over a variable horizon while using an information set revealed progressively. The approach is based on Gaussian processes, which provide a full covariance matrix between all points for which a forecast is requested. This information is put to good use by an algorithm that actively trades price spreads between commodity futures contracts. Out of sample, the proposed approach produces a significant risk-adjusted return, after transaction costs, on a portfolio of 30 assets.
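The non-additive criterion mentioned above can be made concrete with a small sketch: directly optimizing portfolio weights against a Sharpe-ratio objective on synthetic returns. This is only a stand-in for the thesis's neural-network setting; the data, the long-only bounds, and the simplex constraint are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
R = rng.normal(0.0005, 0.01, size=(1000, 5))    # synthetic daily returns
R += rng.normal(0.0002, 0.002, size=5)          # distinct per-asset drifts

def neg_sharpe(w):
    port = R @ w                                # portfolio return series
    # Sharpe is a ratio of moments over the whole series, hence
    # non-additive across time steps.
    return -np.sqrt(252) * port.mean() / port.std()

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
res = minimize(neg_sharpe, np.full(5, 0.2), bounds=[(0, 1)] * 5,
               constraints=cons)
print("annualized Sharpe:", round(-res.fun, 3), "weights:", res.x.round(3))
```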

Relevance:

60.00%

Publisher:

Abstract:

Modelling and prediction of pedestrian routing behaviours within known built environments have recently attracted the attention of researchers across multiple disciplines, owing to the growing demand on urban resources and the requirement for efficient use of public facilities. This study investigates pedestrians' routing behaviours within an indoor environment under normal, non-panic conditions. A network-based method using constrained Delaunay triangulation is adopted, and a utility-based model employing dynamic programming is developed. The main contribution of this study is the formulation of an appropriate utility function that allows an effective application of dynamic programming to predict a series of consecutive waypoints within a built environment. The aim is to generate an accurate sequence of waypoints for the pedestrian walking path using only the structural definition of the environment, as given in a standard CAD format. The simulation results are benchmarked against those from the A* algorithm, and the outcome positively indicates the usefulness of the proposed method in predicting pedestrians' route selection activities.
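A minimal sketch of the utility-based dynamic-programming idea, with a hand-made adjacency graph standing in for the network that the paper derives from a constrained Delaunay triangulation of the CAD geometry. Here edge utilities are simply negative distances, and a greedy rollout of the converged value function yields the predicted waypoint sequence.

```python
import numpy as np

# Nodes are waypoint candidates; edges carry travel cost (negative utility).
n, goal = 6, 5
edges = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2, 5],
         4: [2, 5], 5: [3, 4]}
dist = {(i, j): 1.0 for i in edges for j in edges[i]}
dist[(2, 4)] = dist[(4, 2)] = 0.6      # a wider corridor, say (assumption)

V = np.full(n, -np.inf)                # value = max achievable utility
V[goal] = 0.0
for _ in range(n):                     # Bellman backups to a fixed point
    for i in range(n):
        if i != goal:
            V[i] = max(-dist[(i, j)] + V[j] for j in edges[i])

node, path = 0, [0]                    # greedy rollout from the entrance
while node != goal:
    node = max(edges[node], key=lambda j: -dist[(node, j)] + V[j])
    path.append(node)
print("predicted waypoint sequence:", path)   # matches the shortest path
```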

Relevance:

60.00%

Publisher:

Abstract:

"Risk Measures in Financial Mathematics." Value-at-Risk (VaR) is a risk measure whose use is mandated by banking supervision. The advantage of VaR, as a quantile of the profit-and-loss distribution, lies above all in its simple interpretability. A disadvantage is that the left tail of the probability distribution is not taken into account. Moreover, VaR is difficult to compute because quantiles are not additive. The greatest drawback of VaR is its lack of subadditivity. Alternatives such as Expected Shortfall are therefore examined. In this thesis, financial risk measures are first introduced and some of their fundamental properties recorded. We deal with various parametric and non-parametric methods for determining VaR, including their respective advantages and disadvantages. Furthermore, we consider parametric and non-parametric estimators of VaR in discrete time. We present portfolio optimization problems in the Black-Scholes model with constrained VaR and with constrained variance, and explain the advantage of the first approach over the second. We solve utility optimization problems for terminal wealth with constrained VaR and with constrained variance. VaR says nothing about losses beyond its level, whereas Expected Shortfall takes them into account. We therefore use Expected Shortfall, instead of the risk measure VaR considered by Emmer, Korn and Klüppelberg (2001), for portfolio optimization in the Black-Scholes model.
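The difference between VaR and Expected Shortfall is easy to see numerically. The sketch below computes both from a simulated heavy-tailed loss sample; the sign convention (losses positive) and parameters are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
loss = rng.standard_t(df=4, size=100_000)  # heavy-tailed loss sample
alpha = 0.99                               # confidence level

var = np.quantile(loss, alpha)             # VaR: the alpha-quantile of loss
es = loss[loss >= var].mean()              # ES: mean loss beyond the VaR
print(f"VaR_{alpha} = {var:.3f}, ES_{alpha} = {es:.3f}")   # ES > VaR
```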

Relevance:

30.00%

Publisher:

Abstract:

The study presents a theory of utility models based on aspiration levels, as well as the application of this theory to the planning of timber-flow economics. The first part of the study comprises a derivation of the utility-theoretic basis for the application of aspiration levels. Two basic models are dealt with: the additive and the multiplicative. Applied here solely to partial utility functions, aspiration and reservation levels are interpreted as defining piecewise linear functions. The standpoint of the decision-maker's choices is emphasized through the use of indifference curves. The second part of the study introduces a model for the management of timber flows. The model is based on the assumption that the decision-maker is willing to specify a shape of income flow which differs from that of the capital-theoretic optimum. The utility model comprises four aspiration-based compound utility functions. The theory and the flow model are tested numerically by computations covering three forest holdings. The results show that the additive model is sensitive to even slight changes in relative importances and aspiration levels. This applies particularly to nearly linear production possibility boundaries of monetary variables. The multiplicative model, on the other hand, is stable because it generates strictly convex indifference curves. Owing to a higher marginal rate of substitution, the multiplicative model implies a stronger dependence on forest management than the additive function. For income trajectory optimization, a method utilizing an income trajectory index is more efficient than one based on the use of aspiration levels per management period. Smooth trajectories can be attained by squaring the deviations of the feasible trajectories from the desired one.
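A small sketch of the aspiration-level construction with illustrative numbers: each partial utility is piecewise linear, anchored so the reservation level maps to 0 and the aspiration level to 1, and the partial utilities are combined once additively and once multiplicatively. The slope past the aspiration level and all levels and weights below are assumptions.

```python
import numpy as np

def partial_utility(x, reservation, aspiration, slope_after=0.2):
    # Piecewise linear: 0 at the reservation level, 1 at the aspiration
    # level, a flatter reward beyond it, clipped below at 0.
    u = (x - reservation) / (aspiration - reservation)
    u = np.where(x > aspiration, 1 + slope_after * (u - 1), u)
    return np.maximum(u, 0.0)

incomes = np.array([40.0, 55.0, 80.0])     # e.g. income per period
res_lvl = np.array([30.0, 40.0, 50.0])     # reservation levels
asp_lvl = np.array([60.0, 70.0, 90.0])     # aspiration levels
w = np.array([0.5, 0.3, 0.2])              # relative importances

u = partial_utility(incomes, res_lvl, asp_lvl)
additive = float(w @ u)                    # sensitive to small changes
multiplicative = float(np.prod(u ** w))    # strictly convex indifference curves
print("additive U:", round(additive, 4), "multiplicative U:",
      round(multiplicative, 4))
```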

Relevance:

30.00%

Publisher:

Abstract:

It is well known that in the time-domain acquisition of NMR data, the signal-to-noise ratio (S/N) improves as the square root of the number of transients accumulated. However, the amplitude of the measured signal varies during the time of detection, having a functional form that depends on the coherence detected. Matching the time spent signal averaging to the expected amplitude of the observed signal should therefore also improve the detected S/N. Following this reasoning, Barna et al. (J. Magn. Reson. 75, 384, 1987) demonstrated the utility of exponential sampling in one- and two-dimensional NMR, using maximum-entropy methods to analyze the data. It is proposed here that for two-dimensional experiments the exponential sampling be replaced by exponential averaging. The data thus collected can be analyzed by standard fast-Fourier-transform routines. We demonstrate the utility of exponential averaging in 2D NOESY spectra of the protein ubiquitin, in which an enhanced S/N is observed. It is also shown that the method can acquire delayed double-quantum-filtered COSY spectra without phase distortion.
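A back-of-envelope sketch of why matching the averaging to the signal envelope helps: with n_k transients averaged at the k-th evolution time, the noise there scales as 1/sqrt(n_k), and under a matched filter the peak SNR obeys SNR^2 proportional to sum(s_k^2 * n_k). Comparing uniform and exponential allocations of a fixed transient budget (illustrative parameters, not the paper's data):

```python
import numpy as np

K, N, T2 = 128, 128 * 64, 1.0          # t1 points, total transients, decay
t = np.linspace(0.0, 3.0 * T2, K)      # evolution times
s = np.exp(-t / T2)                    # signal envelope of the coherence

def matched_snr2(n):                   # proportional to peak SNR^2
    return np.sum(s**2 * n)

uniform = np.full(K, N / K)            # conventional equal averaging
exponential = N * s / s.sum()          # transients matched to the envelope
gain = np.sqrt(matched_snr2(exponential) / matched_snr2(uniform))
print(f"S/N gain of exponential over uniform averaging: {gain:.2f}x")
```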

Relevance:

30.00%

Publisher:

Abstract:

We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type, and report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as possible to the agents as rebates. Two performance criteria are of interest within the class of linear rebate functions: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus; the goal is to minimize each. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems in which the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear program (LP). We identify the number of samples needed for "near-feasibility" of the relaxed constraint set and, under some conditions on the valuation function, show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extensions of the proposed mechanisms to situations in which the valuation functions are not known to the central planner are also discussed.

Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering information privately held by strategic users, where the utilities are concave functions of the allocations and where the resource planner is interested not in maximizing revenue but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism and then returning as much of the collected money as possible as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear program, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We demonstrate via simulation, however, that if the mechanism is repeated several times over independent instances, past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
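The constraint-sampling step can be illustrated in the single-item (indivisible) special case mentioned above: each agent's rebate is linear in the sorted bids of the other agents, the budget and non-negativity constraints, which hold for a continuum of type vectors, are imposed only at sampled types, and the LP maximizes the expected total rebate. The setup below is my illustration, not the paper's exact program.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, S = 5, 2000                          # agents, sampled type profiles
thetas = rng.uniform(0, 1, size=(S, n))

A_ub, b_ub = [], []
obj = np.zeros(n - 1)                   # rebate coefficients c (length n-1)
for th in thetas:
    vcg = np.sort(th)[-2]               # VCG revenue = second-highest bid
    # r_i = c @ (bids of the others, sorted in decreasing order)
    rows = np.array([np.sort(np.delete(th, i))[::-1] for i in range(n)])
    A_ub.append(rows.sum(axis=0)); b_ub.append(vcg)  # sum_i r_i <= revenue
    for row in rows:                    # r_i >= 0  <=>  -row @ c <= 0
        A_ub.append(-row); b_ub.append(0.0)
    obj += rows.sum(axis=0) / S         # E[total rebate] as a function of c

res = linprog(-obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(None, None)] * (n - 1), method="highs")
print("sampled-LP rebate coefficients:", res.x.round(4))
```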

Relevance:

30.00%

Publisher:

Abstract:

Topology optimization methods have been shown to have extensive application in the design of microsystems. However, their utility in practical situations is restricted to predominantly planar configurations, owing to the limitations of most microfabrication techniques in realizing structures with arbitrary topologies in the direction perpendicular to the substrate. This study addresses the problem of synthesizing optimal topologies in the out-of-plane direction while obeying the constraints imposed by surface micromachining. This paper presents a new formulation that achieves this by defining a design space that implicitly obeys the manufacturing constraints through a continuous design parameterization, in contrast to including manufacturing cost in the objective function or constraints. The resulting solutions of the new formulation, obtained with gradient-based optimization, directly provide the photolithographic mask layouts. Two examples that illustrate the approach for the case of stiff structures are included.
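One way to picture a design space that implicitly obeys surface-micromachining constraints (my illustration of the general idea, not the paper's exact formulation): give each column above the substrate a continuous height variable and turn it into a density field through a smooth Heaviside projection, so every design in the space is buildable bottom-up and gradients with respect to the heights exist.

```python
import numpy as np

# Each of the nx columns gets a continuous height variable h[x]; a smooth
# Heaviside turns it into a 0..1 density field rho(z, x) that is always
# solid from the substrate up to h[x].  No overhangs or enclosed voids can
# occur, so the manufacturing constraint is built into the design space,
# and rho is differentiable in h for gradient-based optimization.
nx, nz, beta = 8, 6, 10.0
h = np.random.default_rng(0).uniform(0, nz, size=nx)  # design variables

z = np.arange(nz)[:, None] + 0.5      # element centres along build direction
rho = 0.5 * (1 + np.tanh(beta * (h[None, :] - z)))
print(np.round(rho, 2))               # ~1 below h[x], ~0 above it
# Thresholding each deposited layer of rho at 0.5 yields a candidate
# photolithographic mask layout for that layer.
```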

Relevance:

30.00%

Publisher:

Abstract:

TCP performance degrades when end-to-end connections extend over wireless links, which are characterized by high bit error rates and intermittent connectivity. Such link characteristics can significantly degrade TCP performance, as the TCP sender assumes wireless losses to be congestion losses and takes unnecessary congestion control actions. Link errors can be reduced by increasing transmission power, code redundancy (FEC) or the number of retransmissions (ARQ). But increasing power costs resources, increasing code redundancy reduces the available channel bandwidth, and increasing persistency increases end-to-end delay. The paper proposes optimizing TCP through proper tuning of power management, FEC and ARQ in wireless environments (WLAN and WWAN). In particular, we conduct analytical and numerical analyses of TCP (including "wireless-aware" TCP) performance under different settings. Our results show that increasing power, redundancy and/or retransmission levels always improves TCP performance by reducing link-layer losses. However, such improvements come at a cost, and arbitrary improvement cannot be realized without paying a lot in return. It is therefore important to consider a net utility function to be optimized, thus maximizing throughput at the least possible cost.
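A toy version of the net-utility tuning argued for above, with all channel models and cost constants assumed: grid-search the transmit power, FEC redundancy, and ARQ persistence to maximize goodput minus power and delay costs.

```python
import numpy as np
from scipy.stats import binom

L, B = 1000, 1.0                       # packet size (bits), link bandwidth
c_pow, c_delay = 0.3, 0.05             # unit costs of power and delay

def net_utility(P, f, m):
    ber = 0.5 * np.exp(-2.0 * P)       # assumed power -> bit-error model
    t = int(f * L / 2)                 # errors correctable from redundancy f
    ploss = float(binom.sf(t, L, ber)) # residual packet loss after FEC
    tries = m if ploss >= 1.0 else (1 - ploss**m) / (1 - ploss)
    success = 1 - ploss**m             # delivered within m ARQ attempts
    goodput = B * (1 - f) * success / max(tries, 1e-9)
    return goodput - c_pow * P * tries - c_delay * tries

grid = [(P, f, m) for P in np.linspace(0.5, 3.0, 6)
        for f in np.linspace(0.0, 0.4, 5)
        for m in (1, 2, 3, 4)]
best = max(grid, key=lambda g: net_utility(*g))
print("best (power, FEC redundancy, ARQ persistence):", best)
```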

Relevance:

30.00%

Publisher:

Abstract:

To evaluate the performance of co-channel-transmission-based communication, we propose a new metric for the area spectral efficiency (ASE) of an interference-limited ad-hoc network, assuming that the nodes are randomly distributed according to a Poisson point process (PPP). We introduce a utility function, U = ASE/delay, and derive the optimal ALOHA transmission probability p and SIR threshold τ that jointly maximize the ASE and minimize the local delay. Finally, numerical results confirm that the joint optimization based on the U metric achieves a significant performance gain compared to conventional systems.
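Under textbook stand-in assumptions (Poisson bipolar network, Rayleigh fading, interference-limited operation, independent node locations each slot), the joint optimization of p and τ can be sketched as a grid search on U = ASE/delay:

```python
import numpy as np
from scipy.special import gamma

lam, r, alpha = 0.1, 1.0, 4.0          # node density, link distance, path loss
C = np.pi * gamma(1 + 2 / alpha) * gamma(1 - 2 / alpha)

def U(p, tau):
    ps = np.exp(-lam * p * C * r**2 * tau**(2 / alpha))  # success probability
    ase = lam * p * ps * np.log2(1 + tau)                # bit/s/Hz/m^2
    delay = 1.0 / (p * ps)                               # mean local delay
    return ase / delay

p_grid = np.linspace(0.01, 1.0, 100)
tau_grid = np.logspace(-1, 1.3, 100)
vals = np.array([[U(p, t) for t in tau_grid] for p in p_grid])
i, j = np.unravel_index(vals.argmax(), vals.shape)
print(f"p* = {p_grid[i]:.2f}, tau* = {tau_grid[j]:.2f}, U* = {vals[i, j]:.4f}")
```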

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a particle swarm optimization (PSO) approach to support electricity producers in multiperiod optimal contract allocation. The producer's risk preference is expressed by a utility function (U) capturing the tradeoff between the expectation and the variance of the return. Variance estimation and expected return are based on a forecasted scenario interval determined by a price-range forecasting model developed by the authors; a confidence level is associated with each forecasted scenario interval. The proposed model makes use of contracts with physical (spot and forward) and financial (options) settlement. PSO performance was evaluated by comparison with a genetic-algorithm-based approach. This model can be used by producers in deregulated electricity markets, but it can easily be adapted to load-serving entities and retailers, as well as to other types of contracts.
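A minimal PSO sketch for a mean-variance utility of the kind described (synthetic prices and variances; the paper's contracts and price-interval forecasts are richer):

```python
import numpy as np

rng = np.random.default_rng(2)
n_c, k_risk = 4, 2.0                    # number of contracts, risk weight
mu = np.array([50.0, 52.0, 49.0, 55.0]) # expected return per contract
cov = np.diag([64.0, 9.0, 4.0, 100.0])  # variances from scenario intervals

def utility(x):
    x = np.abs(x) / (np.abs(x).sum() + 1e-12)   # project to allocation shares
    return mu @ x - k_risk * (x @ cov @ x)      # U = E[return] - k * Var

n_p, iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5  # swarm hyperparameters
X = rng.uniform(0, 1, (n_p, n_c))
V = np.zeros_like(X)
pbest, pval = X.copy(), np.array([utility(x) for x in X])
gbest = pbest[pval.argmax()].copy()
for _ in range(iters):
    V = (w * V + c1 * rng.uniform(size=X.shape) * (pbest - X)
               + c2 * rng.uniform(size=X.shape) * (gbest - X))
    X = X + V
    vals = np.array([utility(x) for x in X])
    improved = vals > pval
    pbest[improved], pval[improved] = X[improved], vals[improved]
    gbest = pbest[pval.argmax()].copy()
shares = np.abs(gbest) / np.abs(gbest).sum()
print("allocation shares:", shares.round(3), " U:", round(utility(gbest), 3))
```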

Relevance:

30.00%

Publisher:

Abstract:

It is generally challenging to determine the end-to-end delays of applications when maximizing the aggregate system utility subject to timing constraints. Many practical approaches suggest the use of intermediate deadlines for tasks in order to control and upper-bound their end-to-end delays. This paper proposes a unified framework for different time-sensitive, global optimization problems, and solves them in a distributed manner using Lagrangian duality. The framework uses global viewpoints to assign intermediate deadlines, taking resource contention among tasks into consideration. For soft real-time tasks, the proposed framework effectively addresses the deadline assignment problem while maximizing the aggregate quality of service. For hard real-time tasks, we show that existing heuristic solutions to the deadline assignment problem can be incorporated into the proposed framework, enriching their mathematical interpretation.
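The distributed solution via Lagrangian duality can be sketched with assumed concave per-task utilities: each task solves a local subproblem given a dual price on the shared end-to-end deadline budget, and a coordinator updates the price by subgradient ascent.

```python
import numpy as np

# Maximize sum_i w_i*log(d_i) subject to sum_i d_i <= D (assumed utilities).
w = np.array([1.0, 3.0, 2.0])     # per-task QoS weights (assumed)
D = 12.0                          # end-to-end deadline budget
lam, step = 1.0, 0.05             # dual price and subgradient step

for _ in range(500):
    d = w / lam                   # each task maximizes w_i*log(d_i) - lam*d_i
    lam = max(lam + step * (d.sum() - D), 1e-6)   # coordinator price update
print("intermediate deadlines:", d.round(3))      # converges to D*w/sum(w)
```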

Relevance:

30.00%

Publisher:

Abstract:

Influenza surveillance draws on a wide spectrum of data, including syndromic surveillance data from emergency departments. More and more variables are recorded in emergency departments' electronic records and made available to surveillance teams. The main objective of this thesis is to assess the potential usefulness of age, triage category, and disposition at emergency-department discharge for improving the surveillance of morbidity associated with severe influenza cases. Data from a subset of Montreal hospitals were used, from April 2006 to January 2011. Hospitalizations with a diagnosis of pneumonia or influenza were used as the measure of morbidity associated with severe influenza cases, and were modelled by negative binomial regression, accounting for secular and seasonal trends. Compared with total influenza-like-illness (ILI) visits, ILI visits stratified by age, triage category, and discharge disposition improved the predictive model of hospitalizations with pneumonia or influenza. Before integrating these variables into Montreal's surveillance system, additional steps are suggested, including optimizing the ILI case definition to be used, confirming the value of these predictors with new data, and assessing their practical usefulness.
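A sketch of the modelling step on synthetic data (the thesis uses Montreal emergency-department records): weekly pneumonia-or-influenza hospitalization counts regressed on a stratified ILI visit count plus secular and seasonal trend terms, via a negative binomial GLM in statsmodels.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
weeks = np.arange(260)
season = np.column_stack([np.sin(2 * np.pi * weeks / 52),
                          np.cos(2 * np.pi * weeks / 52)])
ili = rng.poisson(20 + 15 * np.exp(season[:, 1]))   # one ILI stratum (toy)

X = sm.add_constant(np.column_stack([weeks / 52, season, ili]))
lam = np.exp(1.0 + 0.02 * X[:, 1] + 0.3 * X[:, 3] + 0.03 * ili)
y = rng.negative_binomial(5, 5 / (5 + lam))         # weekly P&I admissions

model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.2))
print(model.fit().summary().tables[1])
```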

Relevance:

30.00%

Publisher:

Abstract:

The thesis deals with the preparation and dielectric characterization of polyaniline and its analogues in the 2-4 GHz ISM band, which lies within the microwave region (300 MHz to 300 GHz) of the electromagnetic spectrum, together with an initial dielectric study at high frequency (0.05 MHz-13 MHz). Polyaniline has been synthesized by an in situ doping reaction at different temperatures and in the presence of inorganic dopants such as HCl, H2SO4, HNO3 and HClO4, and organic dopants such as camphorsulphonic acid [CSA], toluenesulphonic acid [TSA] and naphthalenesulphonic acid [NSA]. The variation in dielectric properties with reaction temperature, dopant and frequency has been studied. The effect of codopants and microemulsions on the dielectric properties has also been studied in the ISM band. The ISM band of frequencies (2-4 GHz) is of great utility in Industrial, Scientific and Medical (ISM) applications. Microwave heating is a very efficient method of heating dielectric materials and is used extensively in industrial as well as household heating applications.