86 results for variable interest entity
Abstract:
A biplot, which is the multivariate generalization of the two-variable scatterplot, can be used to visualize the results of many multivariate techniques, especially those that are based on the singular value decomposition. We consider data sets consisting of continuous-scale measurements, their fuzzy coding and the biplots that visualize them, using a fuzzy version of multiple correspondence analysis. Of special interest is the way quality of fit of the biplot is measured, since it is well-known that regular (i.e., crisp) multiple correspondence analysis seriously under-estimates this measure. We show how the results of fuzzy multiple correspondence analysis can be defuzzified to obtain estimated values of the original data, and prove that this implies an orthogonal decomposition of variance. This permits a measure of fit to be calculated in the familiar form of a percentage of explained variance, which is directly comparable to the corresponding fit measure used in principal component analysis of the original data. The approach is motivated initially by its application to a simulated data set, showing how the fuzzy approach can lead to diagnosing nonlinear relationships, and finally it is applied to a real set of meteorological data.
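Fuzzy coding is the step that turns the continuous measurements into the input of fuzzy multiple correspondence analysis. The sketch below is a minimal illustration, assuming three triangular membership functions hinged at the minimum, median and maximum of a variable; the hinge choice and the helper name fuzzy_code are illustrative, not taken from the paper. Defuzzification back to the original scale is then the membership-weighted sum of the hinge values.

```python
import numpy as np

def fuzzy_code(x, hinges=None):
    """Fuzzy-code a 1-D array into three categories (low, mid, high)
    using triangular membership functions; each row sums to 1."""
    x = np.asarray(x, dtype=float)
    if hinges is None:
        hinges = (x.min(), np.median(x), x.max())   # illustrative hinge choice
    lo, mid, hi = hinges
    coded = np.zeros((x.size, 3))
    left = x <= mid
    coded[left, 0] = (mid - x[left]) / (mid - lo)
    coded[left, 1] = (x[left] - lo) / (mid - lo)
    right = ~left
    coded[right, 1] = (hi - x[right]) / (hi - mid)
    coded[right, 2] = (x[right] - mid) / (hi - mid)
    return coded

# Example: fuzzy-code five values, then defuzzify with the hinge values
x = np.array([1.0, 2.5, 4.0, 7.0, 9.0])
z = fuzzy_code(x)
x_hat = z @ np.array([x.min(), np.median(x), x.max()])
print(z.sum(axis=1))            # each row of memberships sums to 1
print(np.allclose(x_hat, x))    # True: defuzzification recovers the raw data
```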
Abstract:
This article studies the effects of interest rate restrictions on loan allocation. The British government tightened the usury laws in 1714, reducing the maximum permissible interest rate from 6% to 5%. A sample of individual loan transactions reveals that average loan size and minimum loan size increased strongly, while access to credit worsened for those with little social capital. Collateralised credits, which had accounted for a declining share of total lending, returned to their former role of prominence. Our results suggest that the usury laws distorted credit markets significantly; we find no evidence that they offered a form of Pareto-improving social insurance.
Abstract:
Constant interest rate (CIR) projections are often criticized on the grounds that they are inconsistent with the existence of a unique equilibrium in a variety of forward-looking models. This note shows how to construct CIR projections that are not subject to that criticism, using a standard New Keynesian model as a reference framework.
Abstract:
This paper presents several applications to interest rate risk management based on a two-factor continuous-time model of the term structure of interest rates previously presented in Moreno (1996). This model assumes that default-free discount bond prices are determined by the time to maturity and two factors: the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Several new measures of "generalized duration" are presented and applied in different situations in order to manage market risk and yield curve risk. By means of these measures, we are able to compute the hedging ratios that allow us to immunize a bond portfolio by means of options on bonds. Focusing on the hedging problem, it is shown that these new measures allow us to immunize a bond portfolio against changes (parallel and/or in the slope) in the yield curve. Finally, a proposal for overcoming the limitations of conventional duration by means of these new measures is presented and illustrated numerically.
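The paper's generalized duration measures come from its specific two-factor model; as a generic sketch of the underlying hedging-ratio idea, the code below uses finite-difference factor sensitivities of a portfolio and of a hedging instrument, with made-up exponential price functions standing in for the model's closed-form prices (all names and numbers are illustrative, not the paper's).

```python
import numpy as np

def factor_sensitivity(price_fn, long_rate, spread, factor, h=1e-4):
    """Finite-difference sensitivity of a price to one factor
    ('long_rate' or 'spread'), normalised by price (a duration-like measure)."""
    bump = (h, 0.0) if factor == "long_rate" else (0.0, h)
    up = price_fn(long_rate + bump[0], spread + bump[1])
    dn = price_fn(long_rate - bump[0], spread - bump[1])
    base = price_fn(long_rate, spread)
    return -(up - dn) / (2 * h * base)

# Illustrative price functions (placeholders for the model's closed-form prices)
portfolio = lambda l, s: 100 * np.exp(-5 * l - 1.5 * s)   # bond portfolio value
hedge     = lambda l, s: 10 * np.exp(-8 * l - 0.5 * s)    # instrument used to hedge

l0, s0 = 0.04, 0.01
for f in ("long_rate", "spread"):
    D_p = factor_sensitivity(portfolio, l0, s0, f)
    D_h = factor_sensitivity(hedge, l0, s0, f)
    # Units of the hedge needed so the combined position is insensitive to factor f
    n = -D_p * portfolio(l0, s0) / (D_h * hedge(l0, s0))
    print(f, round(n, 3))
```

Immunizing against both factors (level and slope) at once would require two hedging instruments and the solution of the corresponding 2x2 linear system; the loop above only shows the one-factor ratio for each factor separately.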
Abstract:
We consider the joint visualization of two matrices which have common rows and columns, for example multivariate data observed at two time points or split according to a dichotomous variable. Methods of interest include principal components analysis for interval-scaled data, or correspondence analysis for frequency data or ratio-scaled variables on commensurate scales. A simple result in matrix algebra shows that by setting up the matrices in a particular block format, matrix sum and difference components can be visualized. The case when we have more than two matrices is also discussed and the methodology is applied to data from the International Social Survey Program.
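The matrix-algebra result behind the block format can be checked numerically: arranging the two matrices as [[A, B], [B, A]] makes the singular values of the block matrix split exactly into those of the sum A + B and the difference A - B, which is what lets sum and difference components be visualized separately. The particular block arrangement used in the paper may differ; this sketch only verifies the algebraic identity.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 4))          # e.g. data at time point 1
B = rng.normal(size=(6, 4))          # same rows and columns at time point 2

# Block format repeating the common rows and columns
M = np.block([[A, B],
              [B, A]])

sv_block = np.sort(np.linalg.svd(M, compute_uv=False))
sv_sum   = np.linalg.svd(A + B, compute_uv=False)
sv_diff  = np.linalg.svd(A - B, compute_uv=False)
sv_parts = np.sort(np.concatenate([sv_sum, sv_diff]))

# The block matrix carries the sum and difference components jointly
print(np.allclose(sv_block, sv_parts))   # True
```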
Abstract:
This paper proposes a dynamic framework to study the timing of balance of payments crises. The model incorporates two main ingredients: (i) investors have private information; (ii) investors interact in a dynamic setting, weighing the high returns on domestic assets against the incentives to pull out before the devaluation. The model shows that the presence of disaggregated information delays the onset of BOP crises, giving rise to discrete devaluations. It also shows that high interest rates can be effective in delaying and possibly avoiding the abandonment of the peg. The optimal policy is to raise interest rates sharply as fundamentals become very weak. However, this policy is time inconsistent, suggesting a role for commitment devices such as currency boards or IMF pressure.
Abstract:
This paper presents a two-factor model of the term structure of interest rates. We assume that default-free discount bond prices are determined by the time to maturity and two factors: the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Assuming that both factors follow a joint Ornstein-Uhlenbeck process, a general bond pricing equation is derived. We obtain a closed-form expression for bond prices and examine its implications for the term structure of interest rates. We also derive a closed-form solution for interest rate derivatives prices. This expression is applied to price European options on discount bonds and more complex types of options. Finally, empirical evidence of the model's performance is presented.
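As a rough illustration of the assumed factor dynamics, the sketch below Euler-discretizes a joint Ornstein-Uhlenbeck process for the long-term rate and the spread and recovers the implied short rate; the parameter values, correlation and function name are placeholders, not the paper's calibration or its closed-form bond prices.

```python
import numpy as np

def simulate_ou_2factor(n_steps=2500, dt=1/250, seed=0,
                        kappa=(0.2, 0.8), theta=(0.05, 0.01),
                        sigma=(0.01, 0.008), rho=-0.3,
                        x0=(0.05, 0.01)):
    """Euler simulation of a joint Ornstein-Uhlenbeck process for the
    long-term rate L and the spread s (all parameter values illustrative)."""
    rng = np.random.default_rng(seed)
    x = np.empty((n_steps + 1, 2))
    x[0] = x0
    chol = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    k, th, sg = map(np.asarray, (kappa, theta, sigma))
    for t in range(n_steps):
        z = chol @ rng.standard_normal(2)           # correlated shocks
        x[t + 1] = x[t] + k * (th - x[t]) * dt + sg * np.sqrt(dt) * z
    return x

paths = simulate_ou_2factor()
L, s = paths[:, 0], paths[:, 1]
r = L - s   # short (instantaneous) rate implied by the two factors
print(L[-1], s[-1], r[-1])
```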
Abstract:
In some markets, such as the market for drugs or for financial services, sellers have better information than buyers regarding the matching between the buyer's needs and the good's actual characteristics. Depending on the market structure, this may lead to conflicts of interest and/or the underprovision of information by the seller. This paper studies this issue in the market for financial services. The analysis presents a new model of competition between banks, as banks' price competition influences the ensuing incentives for truthful information revelation. We compare two different firm structures: specialized banking, where financial institutions provide a unique financial product, and one-stop banking, where a financial institution is able to provide several financial products which are horizontally differentiated. We show first that, although conflicts of interest may prevent information disclosure under monopoly, competition forces full information provision for sufficiently high reputation costs. Second, in the presence of market power, one-stop banks will use information strategically to increase product differentiation and therefore will always provide reliable information and charge higher prices than specialized banks, thus providing a new justification for the creation of one-stop banks. Finally, we show that, if independent financial advisers are able to provide reliable information, this increases product differentiation and therefore market power, so that it is in the interest of financial intermediaries to promote external independent financial advice.
Abstract:
The criterion, based on thermodynamic theory, that the climatic system tends to extremize some function has motivated several studies. In particular, special attention has been devoted to the possibility that the climate reaches an extremal rate of planetary entropy production. Because both radiative and material effects contribute to total planetary entropy production, climatic simulations obtained at the extremal rates of total, radiative or material entropy production are of interest in order to elucidate which of the three extremal assumptions behaves most similarly to current data. In the present paper, these results have been obtained by applying a 2-dimensional (2-D) horizontal energy balance box model with a few independent variables (surface temperature, cloud cover and material heat fluxes). In addition, climatic simulations for current conditions assuming a fixed cloud cover have been obtained. Finally, sensitivity analyses for both variable and fixed cloud models have been carried out.
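A drastically reduced illustration of the extremal-entropy idea, using a two-box caricature rather than the paper's 2-D box model: the poleward material heat flux is chosen to maximize the material entropy production implied by steady-state energy balance. The linear outgoing-long-wave coefficients, forcings and albedos below are illustrative values only.

```python
import numpy as np

# Two-box caricature of an energy-balance model: box 1 = low latitudes,
# box 2 = high latitudes.  Outgoing long-wave radiation is linearized as
# A + B*T (T in deg C); F is the poleward material heat flux (W m^-2).
A, B = 203.3, 2.09          # W m^-2 and W m^-2 K^-1 (illustrative linear-OLR values)
S1, S2 = 420.0, 240.0       # incoming solar forcing per box, W m^-2
a1, a2 = 0.25, 0.45         # planetary albedo per box

def temperatures(F):
    """Box temperatures (K) from steady-state energy balance for a given flux F."""
    T1 = (S1 * (1 - a1) - A - F) / B + 273.15
    T2 = (S2 * (1 - a2) - A + F) / B + 273.15
    return T1, T2

def material_entropy_production(F):
    """Entropy produced by transporting heat F from the warm to the cold box."""
    T1, T2 = temperatures(F)
    return F * (1.0 / T2 - 1.0 / T1)

# Locate the flux that extremizes (here: maximizes) material entropy production
F_grid = np.linspace(0.0, 60.0, 6001)
sigma = material_entropy_production(F_grid)
F_mep = F_grid[np.argmax(sigma)]
print(F_mep, temperatures(F_mep))
```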
Abstract:
This project is based on the modelling and simulation of systems using a digital simulator, and is intended as a teaching guide to support a course that is expected to be taught at the Universitat de Vic. Simulation is a technique that makes it possible to represent the behaviour of processes (physical, production, services, etc.) without having to access the real system. To analyse, study and improve the behaviour of a system by means of digital simulation, it is first necessary to develop a conceptual model that describes the variables of interest, and then to implement it in a simulator in order to analyse the results. ARENA is the simulation software studied in this project; it is presented as a tool that allows a complete description of the experience that an entity undergoes inside the system as it flows through it. Specifically, version ARENA 10.0 is used. Regarding the structure of the project, theoretical concepts related to simulation are introduced first, together with its advantages, drawbacks and fields of application. Next, focusing on Arena, a simple example is analysed in order to begin seeing how it works. Subsequently, several examples of increasing complexity are studied. These examples are developed step by step so that they can be tried out in the simulator. Throughout the project, Arena's tools and capabilities are examined, along with the results obtained and their interpretation. The project thus aims to provide introductory concepts in the field of simulation in general and, in particular, to describe basic tools for working with Arena.
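The workflow described above (build a conceptual model of the variables of interest, implement it in a simulator, analyse the results) is tool-independent; purely as a minimal illustration outside ARENA, here is a single-server queue written with Python's SimPy library, where the waiting time is the variable of interest and the arrival and service rates are arbitrary.

```python
import random
import simpy

ARRIVAL_MEAN = 4.0     # mean time between arrivals (arbitrary units)
SERVICE_MEAN = 3.0     # mean service time
SIM_TIME = 500.0

waits = []

def customer(env, server):
    """An entity flows through the system: arrive, queue, get served, leave."""
    arrival = env.now
    with server.request() as req:
        yield req                              # wait for the server
        waits.append(env.now - arrival)        # variable of interest: waiting time
        yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN))

def source(env, server):
    """Generates arriving entities."""
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(customer(env, server))

random.seed(42)
env = simpy.Environment()
server = simpy.Resource(env, capacity=1)
env.process(source(env, server))
env.run(until=SIM_TIME)
print(f"served {len(waits)} entities, mean wait {sum(waits) / len(waits):.2f}")
```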
Abstract:
We propose an algorithm that extracts image features that are consistent with the 3D structure of the scene. The features can be robustly tracked over multiple views and serve as vertices of planar patches that suitably represent scene surfaces, while reducing the redundancy in the description of 3D shapes. In other words, the extracted features will offer good tracking properties while providing the basis for 3D reconstruction with minimum model complexity.
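The feature detector proposed in the paper is its own contribution; as a rough sketch of the track-over-multiple-views step using standard off-the-shelf building blocks, the code below detects Shi-Tomasi corners and tracks them with pyramidal Lucas-Kanade optical flow in OpenCV (the image file names are placeholders).

```python
import cv2

# Placeholder file names for two consecutive views of the same scene
prev = cv2.imread("view_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("view_001.png", cv2.IMREAD_GRAYSCALE)

# Detect candidate features in the first view (Shi-Tomasi corners)
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=500,
                             qualityLevel=0.01, minDistance=7)

# Track them into the next view with pyramidal Lucas-Kanade optical flow
p1, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None,
                                           winSize=(21, 21), maxLevel=3)

# Keep only the features that were tracked successfully; these are the
# candidates that could serve as vertices of planar patches downstream
good_prev = p0[status.ravel() == 1]
good_curr = p1[status.ravel() == 1]
print(f"tracked {len(good_curr)} of {len(p0)} features")
```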
Abstract:
We present I-band deep CCD exposures of the fields of galactic plane radio variables. An optical counterpart, based on positional coincidence, has been found for 15 of the 27 observed program objects. The Johnson I magnitude of the sources identified is in the range 18-21.
Abstract:
A considerable fraction of the gamma-ray sources discovered with the Energetic Gamma-Ray Experiment Telescope (EGRET) remain unidentified. The EGRET sources that have been properly identified are either pulsars or variable sources at both radio and gamma-ray wavelengths. Most of the variable sources are strong radio blazars. However, some low galactic-latitude EGRET sources, with highly variable gamma-ray emission, lack any evident counterpart according to the radio data available until now. Aims. The primary goal of this paper is to identify and characterise the potential radio counterparts of four highly variable gamma-ray sources in the galactic plane through mapping the radio surroundings of the EGRET confidence contours and determining the variable radio sources in the field whenever possible. Methods. We have carried out a radio exploration of the fields of the selected EGRET sources using the Giant Metrewave Radio Telescope (GMRT) interferometer at 21 cm wavelength, with pointings separated by months. Results. We detected a total of 151 radio sources. Among them, we identified a few radio sources whose flux density has apparently changed on timescales of months. Despite the limitations of our search, their possible variability makes these objects a top-priority target for multiwavelength studies of the potential counterparts of highly variable, unidentified gamma-ray sources.
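The abstract does not state the criterion used to flag a source as variable; a common choice for two-epoch data is the significance of the flux-density change in units of its combined uncertainty, sketched below with a toy catalogue and an illustrative 4-sigma cut (none of the numbers are from the paper).

```python
import numpy as np

def variability_significance(s1, e1, s2, e2):
    """Significance (in sigma) of a flux-density change between two epochs,
    given flux densities s1, s2 and their uncertainties e1, e2."""
    return np.abs(s1 - s2) / np.sqrt(e1**2 + e2**2)

# Toy catalogue: flux densities (mJy) and errors at two observing epochs
s_epoch1 = np.array([12.1, 5.4, 30.2, 2.2])
e_epoch1 = np.array([0.4, 0.3, 0.9, 0.2])
s_epoch2 = np.array([11.9, 9.8, 29.5, 2.3])
e_epoch2 = np.array([0.5, 0.3, 1.0, 0.2])

sig = variability_significance(s_epoch1, e_epoch1, s_epoch2, e_epoch2)
candidates = np.where(sig > 4.0)[0]     # illustrative 4-sigma cut
print(sig.round(1), candidates)         # only the second source is flagged
```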
Abstract:
This paper describes the development and applications of a super-resolution method, known as Super-Resolution Variable-Pixel Linear Reconstruction. The algorithm works by combining different lower resolution images in order to obtain, as a result, a higher resolution image. We show that it can make significant spatial resolution improvements to satellite images of the Earth's surface, allowing recognition of objects with size approaching the limiting spatial resolution of the lower resolution images. The algorithm is based on the Variable-Pixel Linear Reconstruction algorithm developed by Fruchter and Hook, a well-known method in astronomy but never used for Earth remote sensing purposes. The algorithm preserves photometry, can weight input images according to the statistical significance of each pixel, and removes the effect of geometric distortion on both image shape and photometry. In this paper, we describe its development for remote sensing purposes, show the usefulness of the algorithm working with images as different from astronomical images as remote sensing ones, and show applications to: 1) a set of simulated multispectral images obtained from a real Quickbird image; and 2) a set of multispectral real Landsat Enhanced Thematic Mapper Plus (ETM+) images. These examples show that the algorithm provides a substantial improvement in limiting spatial resolution for both simulated and real data sets without significantly altering the multispectral content of the input low-resolution images, without amplifying the noise, and with very few artifacts.
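The full Variable-Pixel Linear Reconstruction ("drizzle") algorithm of Fruchter and Hook weights each input pixel by the overlap area of a shrunken pixel footprint on the output grid; the sketch below is only the simplified limiting case in which each input pixel deposits its value at a single fine-grid cell (shift-and-add), with synthetic sub-pixel shifts, to show how several low-resolution frames are accumulated with weights on a finer grid.

```python
import numpy as np

def combine_on_fine_grid(frames, shifts, scale=2):
    """Simplified variable-pixel reconstruction: each low-resolution pixel
    deposits its value at the output cell containing its shifted centre
    (the vanishing-drop limit of drizzle), accumulating data and weights."""
    ny, nx = frames[0].shape
    out = np.zeros((ny * scale, nx * scale))
    wgt = np.zeros_like(out)
    yy, xx = np.mgrid[0:ny, 0:nx]
    for img, (dy, dx) in zip(frames, shifts):
        # Position of each input-pixel centre on the fine output grid
        oy = np.clip(np.round((yy + dy) * scale).astype(int), 0, ny * scale - 1)
        ox = np.clip(np.round((xx + dx) * scale).astype(int), 0, nx * scale - 1)
        np.add.at(out, (oy, ox), img)
        np.add.at(wgt, (oy, ox), 1.0)
    return np.divide(out, wgt, out=np.zeros_like(out), where=wgt > 0)

# Synthetic example: four sub-pixel-shifted 32x32 frames of the same scene
rng = np.random.default_rng(1)
scene = rng.random((32, 32))
shifts = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
# Stand-in frames; real inputs would be independently observed, shifted images
frames = [scene for _ in shifts]
hi = combine_on_fine_grid(frames, shifts, scale=2)
print(hi.shape)   # (64, 64): weighted accumulation on the finer grid
```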
Abstract:
Microquasars are binary star systems with relativistic radio-emitting jets. They are potential sources of cosmic rays and can be used to elucidate the physics of relativistic jets. We report the detection of variable gamma-ray emission above 100 gigaelectron volts from the microquasar LS I +61 303. Six orbital cycles were recorded. Several detections occur at a similar orbital phase, which suggests that the emission is periodic. The strongest gamma-ray emission is not observed when the two stars are closest to one another, implying a strong orbital modulation of the emission or absorption processes.