879 results for Subset search
Abstract:
A growing literature integrates theories of debt management into models of optimal fiscal policy. One promising theory argues that the composition of government debt should be chosen so that fluctuations in the market value of debt offset changes in expected future deficits. This complete market approach to debt management is valid even when the government only issues non-contingent bonds. A number of authors conclude from this approach that governments should issue long-term debt and invest in short-term assets. We argue that the conclusions of this approach are too fragile to serve as a basis for policy recommendations. This is because bonds at different maturities have highly correlated returns, causing the determination of the optimal portfolio to be ill-conditioned. To make this point concrete we examine the implications of this approach to debt management in various models, both analytically and using numerical methods calibrated to the US economy. We find that the complete market approach recommends asset positions which are huge multiples of GDP. Introducing persistent shocks or capital accumulation only worsens this problem. Increasing the volatility of interest rates through habits partly reduces the size of these positions. Across our simulations we find no presumption that governments should issue long-term debt: policy recommendations can be easily reversed through small perturbations in the specification of shocks or small variations in the maturity of bonds issued. We further extend the literature by removing the assumption that governments costlessly repurchase all outstanding debt every period. This exacerbates the size of the required positions, worsens their volatility and in some cases produces instability in debt holdings. We conclude that it is very difficult to insulate fiscal policy from shocks by using the complete markets approach to debt management. Given the limited variability of the yield curve, using maturities is a poor way to substitute for state-contingent debt. As a result, the positions recommended by this approach conflict with a number of features that we believe are important in making bond markets incomplete, e.g. allowing for transaction costs, liquidity effects, etc. Until these features are all fully incorporated, we remain in search of a theory of debt management capable of providing robust policy insights.
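To see concretely why near-collinear maturities make the prescribed portfolio ill-conditioned, consider a minimal numerical sketch (the two-state bond returns below are invented for illustration, not taken from the paper's calibration):

```python
# Two bonds whose state-contingent returns are almost collinear: the positions
# that replicate a deficit shock are huge, offsetting, and sign-unstable.
import numpy as np

# Hypothetical gross returns of a short and a long bond in two states (columns are highly correlated).
R = np.array([[1.00, 1.02],
              [1.01, 1.04]])
deficit = np.array([1.0, -1.0])           # funding need to be hedged in each state

x = np.linalg.solve(R, deficit)           # portfolio that completes the market
print("condition number:", np.linalg.cond(R))
print("positions (multiples of the shock):", x)

# Perturbing the long bond's return by one percentage point flips the sign of both positions.
R_perturbed = R + np.array([[0.0, 0.01],
                            [0.0, 0.00]])
print("perturbed positions:", np.linalg.solve(R_perturbed, deficit))
```

With the unperturbed returns the hedge requires long and short positions of roughly two hundred times the size of the shock; the small perturbation reverses and further inflates them, mirroring the fragility described above.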
Abstract:
In order to upgrade the reliability of xenodiagnosis, attention has been directed towards the population dynamics of the parasite, with particular interest in the following factors: 1. Parasite density, which is not a research objective in itself but, by giving an accurate portrayal of parasite development and multiplication, has been incorporated into the screening of bugs for xenodiagnosis. 2. On the assumption that food availability might increase parasite density, bugs from xenodiagnosis have been refed at biweekly intervals on chicken blood. 3. Infectivity rates and positives harbouring large parasite yields were based on gut infections, in which the parasite population, comprising all developmental forms, was more abundant and easier to detect than in fecal infections, thus minimizing the probability of recording false negatives. 4. Since parasite density, low in the first 15 days of infection, increases rapidly in the following 30 days, an interval of 45 days has been adopted for routine examination of bugs from xenodiagnosis. By following the enumerated measures, all aimed at reducing false negative cases, we are getting closer to a reliable xenodiagnostic procedure. Upgrading the efficacy of xenodiagnosis also depends on the xenodiagnostic agent. Of 9 investigated vector species, Panstrongylus megistus deserves top priority as a xenodiagnostic agent. Its extraordinary capability to support fast development and vigorous multiplication of the few parasites ingested from a host with chronic Chagas' disease is revealed by the strikingly close infectivity rates of 91.2% vs. 96.4% among bugs engorged from the same host in the chronic and acute phases of the disease, respectively (Table V); in the acute phase the host carried an estimated 12.3 × 10³ parasites in the circulation at the time of xenodiagnosis, as reported previously by the authors (1982).
Abstract:
Recent findings on the immunodiagnosis of schistosomiasis mansoni have shown that purified Schistosoma mansoni antigens do not provide maximum positivity. Therefore, the authors suggest the use of semi-purified antigens for diagnostic purposes. So far, no serological marker has been found for patients considered cured, as shown by negative stool examination. However, a tendency for IgG antibody titres to decrease was observed when egg antigen was used.
Abstract:
Tetrasomy, pentasomy, and hexasomy 8 (polysomy 8) are relatively rare compared to trisomy 8. Here we report on a series of 12 patients with acute myeloid leukemia (AML), myelodysplastic syndrome (MDS), or myeloproliferative disorder (MPD) associated with polysomy 8 as detected by conventional cytogenetics and fluorescence in situ hybridization (FISH). In an attempt to better characterize the clinical and hematological profile of this cytogenetic entity, our data were combined with those of 105 published patients. Tetrasomy 8 was the most common presentation of polysomy 8. In 60.7% of patients, polysomy 8 occurred as part of complex changes (16.2% with 11q23 rearrangements). No cryptic MLL rearrangements were found in cases in which polysomy 8 was the only karyotypic change. Our study demonstrates the existence of a polysomy 8 syndrome, which represents a subtype of AML, MDS, and MPD characterized by a high incidence of secondary diseases, myelomonocytic or monocytic involvement in AML, and poor overall survival (6 months). Age significantly reduced median survival, but associated cytogenetic abnormalities did not modify it. Cytogenetic results further demonstrate an in vitro preferential growth of cells with a high level of aneuploidy, suggesting a selective advantage for polysomy 8 cells.
Abstract:
This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
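As a rough illustration of the spike-and-slab machinery behind Stochastic Search Variable Selection, the sketch below runs a Gibbs sampler for an ordinary regression with known error variance; the simulated data, priors and tuning values are illustrative assumptions and do not reproduce the paper's Vector Error Correction setting or its restrictions on the cointegration space.

```python
# Minimal SSVS sketch in the spirit of George & McCulloch (1993): each coefficient
# gets a mixture prior (narrow "spike" vs. wide "slab") and a Gibbs sampler
# alternates between the coefficients and the inclusion indicators.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: only the first two of five regressors matter.
n, k = 200, 5
X = rng.normal(size=(n, k))
beta_true = np.array([1.5, -1.0, 0.0, 0.0, 0.0])
sigma2 = 1.0
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

tau0, tau1, p_incl = 0.01, 10.0, 0.5      # spike variance, slab variance, prior inclusion probability
gamma = np.ones(k, dtype=int)
draws = []

for it in range(2000):
    # 1) beta | gamma, y: conjugate normal full conditional.
    D_inv = np.diag(np.where(gamma == 1, 1.0 / tau1, 1.0 / tau0))
    A = X.T @ X / sigma2 + D_inv
    cov = np.linalg.inv(A)
    mean = cov @ (X.T @ y) / sigma2
    beta = rng.multivariate_normal(mean, cov)

    # 2) gamma_j | beta_j: Bernoulli draw with spike-vs-slab posterior odds.
    dens1 = p_incl * np.exp(-0.5 * beta**2 / tau1) / np.sqrt(tau1)
    dens0 = (1.0 - p_incl) * np.exp(-0.5 * beta**2 / tau0) / np.sqrt(tau0)
    gamma = (rng.uniform(size=k) < dens1 / (dens0 + dens1)).astype(int)

    if it >= 500:                         # discard burn-in
        draws.append(gamma)

print("posterior inclusion probabilities:", np.mean(draws, axis=0))
```

Averaging the sampled indicators gives posterior inclusion probabilities, the quantity on which the automatic model selection and model averaging described above are built.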
Abstract:
CODEX SEARCH is an information retrieval engine specialized in immigration law and based on linguistic tools and knowledge. A retrieval engine, or Information Retrieval (IR) system, is software capable of locating information in large document collections (a non-trivial setting) in electronic format. A preliminary study showed that immigration law is a discursive domain in which it is difficult to express an information need as the kind of formal query that current retrieval systems expect. Therefore, developing an efficient IR system for this domain requires more than a traditional IR model, i.e. comparing the terms of the query with those of the answer, essentially because such terms do not express implications and because there need not be a one-to-one relation between them. The proposed linguistic solution is therefore based on incorporating the specialist's knowledge by integrating a case library into the system. The cases are examples of procedures applied by experts to solve problems that have actually occurred and that ended in success or failure. The results obtained in this first phase are very encouraging, but further research in this field is needed to improve the performance of the prototype, which can be accessed at http://161.116.36.139/~codex/.
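By way of illustration, the sketch below shows the bag-of-words, term-matching baseline that the abstract argues is insufficient on its own in this domain; the documents and query are invented examples, and scikit-learn's TF-IDF vectorizer stands in for a generic traditional IR model.

```python
# Term-overlap retrieval: a relevant document scores zero when the user's
# wording shares no terms with it, which is the 1-to-1 matching limitation
# described in the abstract.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "requisitos para la renovacion del permiso de residencia temporal",
    "procedimiento de solicitud de asilo y proteccion internacional",
    "reagrupacion familiar de conyuge e hijos menores de edad",
]
query = ["puedo traer a mi familia a vivir conmigo"]   # relevant to the third document

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)
query_vec = vectorizer.transform(query)

# All similarities come out (near) zero despite the query's obvious relevance to
# family reunification, motivating the case-library layer proposed above.
print(cosine_similarity(query_vec, doc_matrix))
```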
Abstract:
This paper shows how one of the developers of QWERTY continued to use the trade secret that underlay its development to seek further efficiency improvements after its introduction. It provides further evidence that this was the principle used to design QWERTY in the first place and adds further weight to arguments that QWERTY itself was a consequence of creative design and an integral part of a highly efficient system, rather than an accident of history. This further serves to raise questions over QWERTY's forced servitude as the 'paradigm case' of an inferior standard in the path dependence literature. The paper also shows how complementarities in forms of intellectual property rights protection played integral roles in the development of QWERTY and the search for improvements on it, and also helped effectively conceal the source of the efficiency advantages that QWERTY helped deliver.
Abstract:
We consider a frictional two-sided matching market in which one side uses public cheap talk announcements so as to attract the other side. We show that if the first-price auction is adopted as the trading protocol, then cheap talk can be perfectly informative, and the resulting market outcome is efficient, constrained only by search frictions. We also show that the performance of an alternative trading protocol in the cheap-talk environment depends on the level of price dispersion generated by the protocol: If a trading protocol compresses (spreads) the distribution of prices relative to the first-price auction, then an efficient fully revealing equilibrium always (never) exists. Our results identify the settings in which cheap talk can serve as an efficient competitive instrument, in the sense that the central insights from the literature on competing auctions and competitive search continue to hold unaltered even without ex ante price commitment.
Abstract:
In a market in which sellers compete by posting mechanisms, we study how the properties of the meeting technology affect the mechanism that sellers select. In general, sellers have incentive to use mechanisms that are socially efficient. In our environment, sellers achieve this by posting an auction with a reserve price equal to their own valuation, along with a transfer that is paid by (or to) all buyers with whom the seller meets. However, we define a novel condition on meeting technologies, which we call “invariance,” and show that the transfer is equal to zero if and only if the meeting technology satisfies this condition.
Abstract:
We develop a life-cycle model of the labor market in which different worker-firm matches have different quality and the assignment of the right workers to the right firms is time consuming because of search and learning frictions. The rate at which workers move between unemployment, employment and across different firms is endogenous because search is directed and, hence, workers can choose whether to seek low-wage jobs that are easy to find or high-wage jobs that are hard to find. We calibrate our theory using data on labor market transitions aggregated across workers of different ages. We validate our theory by showing that it predicts quite well the pattern of labor market transitions for workers of different ages. Finally, we use our theory to decompose the age profiles of transition rates, wages and productivity into the effects of age variation in work-life expectancy, human capital and match quality.
Abstract:
This paper evaluates the effects of policy interventions on sectoral labour markets and the aggregate economy in a business cycle model with search and matching frictions. We extend the canonical model by including capital-skill complementarity in production, labour markets with skilled and unskilled workers, and on-the-job learning (OJL) within and across skill types. We first find that the model does a good job of matching the cyclical properties of sectoral employment and the wage-skill premium. We next find that vacancy subsidies for skilled and unskilled jobs lead to output multipliers which are greater than unity with OJL and less than unity without OJL. In contrast, the positive output effects from cutting skilled and unskilled income taxes are close to zero. Finally, we find that the sectoral and aggregate effects of vacancy subsidies do not depend on whether they are financed via public debt or distorting taxes.
Abstract:
This paper examines the antecedents and innovation consequences of the methods firms adopt in organizing their search strategies. From a theoretical perspective, organizational search is described using a typology that shows how firms implement exploration and exploitation search activities that span their organizational boundaries. This typology includes three models of implementation: ambidextrous, specialized, and diversified implementation. From an empirical perspective, the paper examines the performance consequences when applying these models, and compares their capacity to produce complementarities. Additionally, since firms' choices in matters of organizational search are viewed as endogenous variables, the paper examines the drivers affecting them and identifies the importance of firms' absorptive capacity and diversified technological opportunities in determining these choices. The empirical design of the paper draws on new data for manufacturing firms in Spain, surveyed between 2003 and 2006.
Abstract:
A rational method of searching for natural neolignans of desired structures is outlined. It involves consultation of a collection of chemical profiles of plant families. The profiles are assembled considering the biosynthetic class (in the present case lignoids), the subclass (neolignans), the structural types (neolignan skeleta) and the relative frequency of substitutional derivatives belonging to each type (known compounds). The method is, of course, applicable to any class of natural products. Its use in the case of neolignans is selected here as an example in view of the recently discovered antagonism towards PAF shown by kadsurenone, a representative of this subclass of phytochemicals. The application of the chemical profiles to phylogenetic studies is illustrated.
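As a purely illustrative sketch of how such a profile collection might be encoded and consulted, the toy structure below indexes plant families by class, subclass, structural type and derivative count; the family names, skeleton labels and counts are invented placeholders, not data from the paper.

```python
# Toy chemical-profile collection: family -> biosynthetic class -> subclass ->
# structural type (skeleton) -> number of known substitutional derivatives.
profiles = {
    "Lauraceae": {
        "lignoids": {
            "neolignans": {"benzofuran": 12, "bicyclo[3.2.1]octane": 7},
        },
    },
    "Myristicaceae": {
        "lignoids": {
            "neolignans": {"8.O.4'": 5},
        },
    },
}

def candidate_families(skeleton):
    """Rank plant families by how many known derivatives of the desired skeleton they hold."""
    hits = []
    for family, classes in profiles.items():
        count = classes.get("lignoids", {}).get("neolignans", {}).get(skeleton, 0)
        if count:
            hits.append((family, count))
    return sorted(hits, key=lambda hit: -hit[1])

print(candidate_families("benzofuran"))    # -> [('Lauraceae', 12)]
```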