999 results for Natural extension
Abstract:
The aim of this paper is to extend the classical envelope theorem from scalar to vector differential programming. The result obtained allows us to measure the quantitative behaviour of a certain set of optimal values (not necessarily a singleton), characterized as becoming minimal when the objective function is composed with a positive function, under changes in any of the parameters appearing in the constraints. We show that the sensitivity of the program depends on a Lagrange multiplier and its sensitivity.
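For reference, a minimal sketch of the classical scalar envelope theorem that the paper extends (standard textbook form and notation, not taken from the paper; the vector-valued result itself is not reproduced):

```latex
% Classical scalar envelope theorem. For the parametric program
%   V(\theta) = \min_x f(x,\theta) \quad \text{s.t.} \quad g(x,\theta) \le 0,
% with Lagrangian L(x,\lambda,\theta) = f(x,\theta) + \lambda^\top g(x,\theta),
% at a regular optimum (x^*(\theta), \lambda^*(\theta)):
\[
  \frac{\partial V}{\partial \theta_k}
  = \frac{\partial L}{\partial \theta_k}\bigg|_{x^*(\theta),\,\lambda^*(\theta)}
  = \frac{\partial f}{\partial \theta_k} + \lambda^*(\theta)^\top
    \frac{\partial g}{\partial \theta_k}\big(x^*(\theta),\theta\big).
\]
% When the parameters appear only in the constraints, as in the abstract,
% the first term vanishes, which is why the sensitivity of the program is
% governed by the Lagrange multiplier.
```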
Abstract:
The two-node tandem Jackson network serves as a convenient reference model for the analysis and testing of different methodologies and techniques in rare event simulation. In this paper we consider a new approach to efficiently estimating the probability that the content of the second buffer exceeds some high level L before the buffer empties, starting from a given state. The approach is based on a Markov additive process representation of the buffer processes, leading to an exponential change of measure to be used in an importance sampling procedure. Unlike the changes of measure proposed and studied in the recent literature, the one derived here is a function of the content of the first buffer. We prove that when the first buffer is finite, this method yields asymptotically efficient simulation for any set of arrival and service rates. In fact, the relative error is bounded independently of the level L, a result not established for any other known method. When the first buffer is infinite, we propose a natural extension of the exponential change of measure for the finite buffer case. In this case, the relative error is shown to be bounded (independently of L) only when the second server is the bottleneck, a result known to hold for some other methods derived through large deviations analysis. When the first server is the bottleneck, experimental results suggest that the relative error of our method grows at most linearly in L.
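The paper's buffer-dependent change of measure is not reproduced here. As a hedged illustration of the general technique, the sketch below estimates the overflow probability under the classical state-independent swap of the arrival and second-service rates (in the spirit of Parekh and Walrand), using a uniformized embedded chain; the rates, start state, and infinite first buffer are all illustrative assumptions.

```python
import random

def overflow_prob(lam, mu1, mu2, x1, x2, L, runs=20_000, seed=0):
    """Importance-sampling estimate of P(queue-2 content reaches L before 0)
    in a two-node tandem Jackson network started at (x1, x2), x2 >= 1.
    Uses the classical state-independent exchange lam <-> mu2, NOT the
    paper's measure, which depends on the first buffer's content."""
    rng = random.Random(seed)
    total = lam + mu1 + mu2           # uniformization constant (same under both measures)
    lam_t, mu2_t = mu2, lam           # tilted (swapped) rates; mu1 unchanged
    acc = 0.0
    for _ in range(runs):
        q1, q2, w = x1, x2, 1.0       # w accumulates the likelihood ratio
        while 0 < q2 < L:
            u = rng.random() * total
            if u < lam_t:             # arrival to queue 1
                q1 += 1
                w *= lam / lam_t
            elif u < lam_t + mu1:     # queue-1 service (self-loop if empty; ratio 1)
                if q1 > 0:
                    q1 -= 1
                    q2 += 1
            else:                     # queue-2 departure
                q2 -= 1
                w *= mu2 / mu2_t
        if q2 >= L:                   # overflow happened before emptying
            acc += w
    return acc / runs

# Example with the second server as the bottleneck (mu2 < mu1):
print(overflow_prob(lam=1.0, mu1=3.0, mu2=2.0, x1=0, x2=1, L=25))
```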
Abstract:
During the whole of the nineteenth century and the first decades of the twentieth century the transatlantic book trade was plainly asymmetrical, with Brazil seen by book vendors in Portugal as a natural extension of their market, destined to import books, a situation due largely to the incipient state of Brazilian book production. However, the rapid development of the Brazilian printing and publishing industry in the first half of the twentieth century brought profound changes in the circulation of print material and in the traditional movements of the transatlantic book trade. Aware of those changes, some publishers and booksellers sought ways of expanding their businesses by creating new openings for the circulation of books between the two countries. Taking the particular case of António de Sousa Pinto and his three Luso-Brazilian publishing ventures of the 1940s (Livros de Portugal, Edições Dois Mundos and Livros do Brasil), this article seeks to understand how publishers acted to bring the two sides of the Atlantic closer together for the Lusophone book.
Abstract:
Glasgow Mathematical Journal, no. 50 (2008), pp. 325-333
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, in fulfilment of the requirements for the degree of Master in Electrical and Computer Engineering.
Abstract:
This paper analyzes the linkages between the credibility of a target zone regime, the volatility of the exchange rate, and the width of the band within which the exchange rate is allowed to fluctuate. These three concepts should be related, since the band width induces a trade-off between credibility and volatility: narrower bands should give less scope for the exchange rate to fluctuate, but may make agents perceive a larger probability of realignment, which by itself should increase the volatility of the exchange rate. We build a model where this trade-off is made explicit. The model is used to understand the reduction in volatility experienced by most EMS countries after their target zones were widened in August 1993. As a natural extension, the model also rationalizes the existence of non-official, implicit target zones (or fear of floating) suggested by some authors.
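The abstract does not spell out the model. For orientation only, the canonical target-zone setup this literature builds on (Krugman, 1991) relates the log exchange rate to fundamentals and its own expected change; the paper's specific credibility/volatility model is not reproduced here:

```latex
% Canonical target-zone equation (Krugman 1991), background only.
\[
  s_t = f_t + \alpha \, \frac{E_t[\mathrm{d}s_t]}{\mathrm{d}t},
  \qquad s_t \in [\underline{s}, \overline{s}],
\]
% where s_t is the log exchange rate, f_t the fundamental, and the band
% [\underline{s}, \overline{s}] is defended at its edges. Narrowing the
% band mechanically restricts fluctuations of s_t, while perceived
% realignment risk feeds back into E_t[ds_t]; this is the trade-off the
% paper makes explicit.
```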
Abstract:
The proposed game is a natural extension of the Shapley and Shubik Assignment Game to the case where each seller owns a set of different objects instead of only one indivisible object. We propose definitions of pairwise stability and group stability that are adapted to our framework. Existence of both pairwise and group stable outcomes is proved. We study the structure of the group stable set and finally prove that the set of group stable payoffs forms a complete lattice, with one optimal group stable payoff for each side of the market.
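As background, a minimal sketch of the Shapley-Shubik benchmark that the game extends (standard definitions; notation is mine, and the paper's multi-object definitions are not reproduced):

```latex
% Shapley-Shubik assignment game (benchmark; in the paper each seller
% owns a set of objects rather than a single one).
% Buyers i \in B, sellers j \in S, surplus \alpha_{ij} \ge 0.
% A feasible outcome is a matching \mu with payoffs (u, v) such that
\[
  u_i + v_j = \alpha_{ij} \quad \text{for matched pairs } (i,j) \in \mu,
  \qquad u_i, v_j \ge 0 \text{ (zero if unmatched)}.
\]
% Pairwise stability: no buyer-seller pair can profitably deviate,
\[
  u_i + v_j \ge \alpha_{ij} \quad \text{for all } (i,j) \in B \times S.
\]
% Shapley and Shubik showed the stable payoffs form a complete lattice
% with a buyer-optimal and a seller-optimal point; the paper proves the
% analogous lattice structure for group stable payoffs in the
% multi-object setting.
```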
Abstract:
To allow society to treat unequal alternatives distinctly, we propose a natural extension of Approval Voting in which the assumption of neutrality is relaxed. Under this extension, every alternative receives ex ante a non-negative and finite weight, and these weights may differ across alternatives. Given the voting decisions of every individual (individuals are allowed to vote for, or approve of, as many alternatives as they wish), society elects all alternatives for which the product of the total number of votes and the exogenous weight is maximal. Our main result is an axiomatic characterization of this voting procedure.
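A short sketch of the rule as the abstract describes it (function and variable names are mine, not the paper's):

```python
def weighted_approval_winners(approvals, weights):
    """Elect the alternatives maximizing (approval count) * (exogenous weight).

    approvals: dict mapping each alternative to its total number of approvals
               (each voter may approve any set of alternatives);
    weights:   dict mapping each alternative to its fixed, non-negative and
               finite ex-ante weight.  With all weights equal this reduces
               to classical Approval Voting.
    """
    scores = {a: approvals[a] * weights[a] for a in approvals}
    best = max(scores.values())
    return sorted(a for a, s in scores.items() if s == best)

# Example: "b" wins despite fewer approvals, because of its larger weight.
print(weighted_approval_winners({"a": 10, "b": 6, "c": 4},
                                {"a": 1.0, "b": 2.0, "c": 1.5}))
# -> ['b']  (scores: a=10.0, b=12.0, c=6.0)
```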
Abstract:
Let Γ be a finite graph and G the corresponding free partially commutative group. In this paper we study subgroups generated by vertices of the graph Γ, which we call canonical parabolic subgroups. A natural extension of the definition leads to canonical quasiparabolic subgroups. It is shown that the centralisers of subsets of G are the conjugates of canonical quasiparabolic centralisers satisfying certain graph-theoretic conditions.
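A small hedged example of the objects involved (the definitions are standard; the specific graph is mine, chosen for illustration):

```latex
% For a finite graph \Gamma with vertex set X and edge set E(\Gamma),
% the free partially commutative group is
\[
  G = \langle X \mid [x, y] = 1 \ \text{for all } \{x, y\} \in E(\Gamma) \rangle,
\]
% and for Y \subseteq X the subgroup G(Y) = \langle Y \rangle is a
% canonical parabolic subgroup.  For instance, if \Gamma is the path
% a - b - c, then [a,b] = [b,c] = 1 while a and c do not commute, so
% G(\{a, c\}) is a free group of rank 2.
```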
Abstract:
A final-state-effects formalism suitable for analyzing the high-momentum response of Fermi liquids is presented and used to study the dynamic structure function of liquid 3He. The theory, developed as a natural extension of the Gersch-Rodriguez formalism, incorporates Fermi statistics explicitly through a new additive term which depends on the semidiagonal two-body density matrix. The use of a realistic momentum distribution, calculated with the diffusion Monte Carlo method, together with the inclusion of this additive correction, yields good agreement with available deep-inelastic neutron scattering data.
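For orientation, the impulse-approximation leading order around which Gersch-Rodriguez-type 1/q expansions are built (standard y-scaling form, with units where hbar = 1; the paper's additive Fermi-statistics term is not reproduced):

```latex
% Leading (impulse-approximation) term of the high-momentum response:
\[
  S(q, \omega) \xrightarrow[q \to \infty]{} \frac{m}{q}\, J(y),
  \qquad
  J(y) = \int \frac{\mathrm{d}^3 p}{(2\pi)^3}\, n(\mathbf{p})\,
         \delta\!\left(y - p_{\parallel}\right),
\]
% with scaling variable y = (m/q)(\omega - q^2/2m) and momentum
% distribution n(p) (obtained in the paper from diffusion Monte Carlo).
% Final-state effects enter as corrections to J(y).
```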
Abstract:
This master's thesis was carried out as part of Componenta Cast Components' three-year supply chain development project. The aim of the work was to describe a typical internal supply chain process of the company and to perform a preliminary performance analysis of the logistics process between the foundry and the machine shop. A further purpose was to identify areas for improvement in the management of material and information flows between these production units. Suitable analysis methods were selected on the basis of a literature review covering logistics, supply chain management and supply chain performance measurement, together with practical experience. These methods were used to describe the order-delivery process and to analyze the performance of the company's internal supply chain. As a natural extension, a pull-type production and material control method that synchronizes the supply chain was developed and put into practice. Tools supporting the proper use of the implemented method were also developed during the thesis project. The project took the first steps towards an integrated internal supply chain. Standardization of the new production and material control method in combination with other methods, as well as further development of key supply chain performance indicators, has already begun. The degree of integration can be raised further by shortening lead times and by means of a synchronized, transparent demand-supply chain. Cross-organizational development and management of the supply chain is a key prerequisite for success.
Abstract:
The Dudding group is interested in the application of Density Functional Theory (DFT) to the development of asymmetric methodologies, and the focus of this dissertation is accordingly on the integration of these approaches. Several interrelated subsets of computer-aided design and implementation in catalysis were addressed during the course of these studies. The first aim rested upon the advancement of methodologies for the synthesis of biologically active C(1)-chiral 3-methylene-indan-1-ols, which in practice led to the use of a sequential asymmetric Yamamoto-Sakurai-Hosomi allylation/Mizoroki-Heck reaction sequence. An important aspect of this work was the utilization of ortho-substituted arylaldehyde reagents, which are known to be a problematic class of substrates for existing asymmetric allylation approaches. The second phase of my research program led to the further development of asymmetric allylation methods using o-arylaldehyde substrates for the synthesis of chiral C(3)-substituted phthalides. Apart from the de novo design of these chemistries in silico, which notably utilized water-tolerant, inexpensive and relatively environmentally benign indium metal, this work represented the first computational study of a stereoselective indium-mediated process. Following from these discoveries was a related, yet catalytic, Ag(I)-catalyzed approach for preparing C(3)-substituted phthalides that from a practical standpoint was complementary in many ways. Not only did this new methodology build upon my earlier work with the integrated (experimental/computational) use of Ag(I)-catalyzed asymmetric methods in synthesis, it provided fundamental insight, arrived at through DFT calculations, into the Yamamoto-Sakurai-Hosomi allylation. The development of ligands for unprecedented asymmetric Lewis base catalysis, especially asymmetric allylations using silver and indium metals, followed as a natural extension of these earlier discoveries. To this end, a family of disubstituted (N-cyclopropenium guanidine/N-imidazoliumyl substituted cyclopropenylimine) nitrogen adducts was advanced that provided fundamental insight into chemical bonding and offered an unprecedented class of phase transfer catalysts (PTC) with far-reaching potential. Salient features of these disubstituted nitrogen species are the unprecedented finding of a cyclopropenium-based C-H···π(aryl) interaction and the presence of a highly dissociated anion, which projected them to serve as catalysts promoting fluorination reactions. Attracted by the timely development of these disubstituted nitrogen adducts, my final studies as a PhD scholar addressed the utility of one of the synthesized adducts as a valuable catalyst for the benzylation of the Schiff base N-(diphenylmethylene)glycine ethyl ester. Additionally, the catalyst was applied to benzylic fluorination; emerging from this exploration was the successful fluorination of benzyl bromide and its derivatives in high yields. A notable feature of this protocol is the column-free purification of the product and the recovery of the catalyst for use in further reaction sequences.
Abstract:
The goal of this paper is to contribute to the economic literature on ethnic and cultural diversity by proposing a new index that is informationally richer and more flexible than the commonly used ‘ethno-linguistic fractionalization’ (ELF) index. We characterize a measure of diversity among individuals that takes as a primitive the individuals, as opposed to ethnic groups, and uses information on the extent of similarity among them. Compared to existing indices, our measure does not require that individuals are pre-assigned to exogenously determined categories or groups. We show that our generalized index is a natural extension of ELF and is also simple to compute. We also provide an empirical illustration of how our index can be operationalized and what difference it makes as compared to the standard ELF index. This application pertains to the pattern of fractionalization in the United States.
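A hedged sketch contrasting the standard ELF index with a similarity-based generalization of the kind described (the paper's exact functional form is not reproduced; with 0/1 same-group similarities the generalized index below collapses to ELF):

```python
import numpy as np

def elf(shares):
    """Standard ethno-linguistic fractionalization: the probability that
    two randomly drawn individuals belong to different groups."""
    s = np.asarray(shares, dtype=float)
    return 1.0 - np.sum(s ** 2)

def generalized_fractionalization(similarity):
    """Similarity-based diversity over individuals: the expected
    dissimilarity of two randomly drawn individuals, where
    similarity[i, j] lies in [0, 1] and similarity[i, i] == 1.
    With similarity[i, j] = 1 iff i and j share a group, this equals
    elf() computed on the group shares."""
    s = np.asarray(similarity, dtype=float)
    n = s.shape[0]
    return 1.0 - s.sum() / n ** 2

# Sanity check: three individuals, two in one group, one in another.
groups = [0, 0, 1]
sim = np.array([[1.0 if gi == gj else 0.0 for gj in groups] for gi in groups])
print(elf([2/3, 1/3]))                     # 1 - (4/9 + 1/9) = 0.444...
print(generalized_fractionalization(sim))  # same value
```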
Abstract:
Discrete and continuous zero-inflated models have a wide range of applications, and their properties are well known. Although there is work on zero-deflated and zero-modified discrete models, the usual formulation of continuous zero-inflated models, a mixture of a continuous density and a Dirac mass at zero, prevents their generalization to the zero-deflated case. An alternative formulation of continuous zero-inflated models, which can easily be generalized to the zero-deflated case, is presented here. Estimation is first addressed under the classical paradigm, and several methods for obtaining maximum likelihood estimators are proposed. The point estimation problem is also considered from the Bayesian point of view. Classical and Bayesian hypothesis tests for determining whether data are zero-inflated or zero-deflated are presented. The estimation and testing methods are evaluated by means of simulation studies and applied to aggregated precipitation data. The various methods agree that the data are zero-deflated, demonstrating the relevance of the proposed model. We then consider the clustering of zero-deflated data samples. Since such data are strongly non-normal, common methods for determining the number of clusters can be expected to perform poorly. We argue that Bayesian clustering, based on the marginal distribution of the observations, takes the particularities of the model into account, which should translate into better performance. Several clustering methods are compared by means of a simulation study, and the proposed method is applied to aggregated precipitation data from 28 measurement stations in British Columbia.
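To make the formulation under discussion concrete, here is a minimal sketch of the usual mixture form of a continuous zero-inflated model (Dirac mass at zero plus an exponential density, with its closed-form MLE); this is the very form whose inability to represent zero deflation motivates the thesis, and the alternative formulation itself is not reproduced:

```python
import numpy as np

def fit_zero_inflated_exponential(x):
    """Closed-form MLE for the usual mixture formulation
        X = 0 with prob. p,  X ~ Exponential(rate) with prob. 1 - p.
    Because p is a probability (p >= 0), this mixture can only add mass
    at zero, never remove it; that is the gap the thesis's alternative
    formulation addresses."""
    x = np.asarray(x, dtype=float)
    p_hat = np.mean(x == 0.0)           # share of exact zeros
    rate_hat = 1.0 / x[x > 0.0].mean()  # exponential MLE on the positives
    return p_hat, rate_hat

# Example on simulated precipitation-like data.
rng = np.random.default_rng(0)
data = np.where(rng.random(10_000) < 0.3, 0.0, rng.exponential(2.0, 10_000))
print(fit_zero_inflated_exponential(data))  # approx (0.3, 0.5)
```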
Abstract:
This paper proposes and implements a new methodology for forecasting time series, based on bicorrelations and cross-bicorrelations. It is shown that the forecasting technique arises as a natural extension of, and as a complement to, existing univariate and multivariate non-linearity tests. The formulations are essentially modified autoregressive or vector autoregressive models, respectively, which can be estimated using ordinary least squares. The techniques are applied to a set of high-frequency exchange rate returns, and their out-of-sample forecasting performance is compared to that of other time series models.
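A hedged sketch of the kind of modified autoregression described: the AR regressors are augmented with lagged cross-products (the bicorrelation terms) and the whole model is estimated by OLS. The lag choices and names below are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def fit_bicorrelation_ar(y, p=2):
    """OLS fit of y_t on lags y_{t-i} and cross-products y_{t-i} y_{t-j},
    1 <= i <= j <= p, i.e. an AR(p) augmented with bicorrelation terms."""
    y = np.asarray(y, dtype=float)
    lags = np.column_stack([y[p - i:-i] for i in range(1, p + 1)])
    cross = np.column_stack([lags[:, i] * lags[:, j]
                             for i in range(p) for j in range(i, p)])
    X = np.column_stack([np.ones(len(lags)), lags, cross])
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta

def forecast_next(y, beta, p=2):
    """One-step-ahead forecast using the fitted coefficients."""
    last = np.asarray(y, dtype=float)[-1:-p - 1:-1]   # y_t, ..., y_{t-p+1}
    cross = [last[i] * last[j] for i in range(p) for j in range(i, p)]
    return float(np.concatenate([[1.0], last, cross]) @ beta)

rng = np.random.default_rng(1)
r = rng.standard_normal(500) * 0.01      # stand-in for FX returns
print(forecast_next(r, fit_bicorrelation_ar(r)))
```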