Abstract:
The interfaces between the intrapsychic, interactional, and intergenerational domains are a new frontier. As a pilot, we exposed ourselves to a complex but controllable situation as viewed by people whose main interest lies in one of the three interfaces; we also fully integrated the subjects into the team, to learn about their subjective perspectives and to provide them with an enriching experience. We started with a brief "triadification" sequence (i.e., moving from a "two plus one" to a "three together" family organization). Considering this sequence as representing, at a micro level, many larger family transitions, we proceeded with a microanalytic interview, a psychodynamic investigation, and a family interview. As expected, larger patterns of correspondences are emerging. The central questions under debate are: What are the most appropriate units at each level of description, and what are the articulations between these levels? What is the status of "triadification"?
Abstract:
A growing literature integrates theories of debt management into models of optimal fiscal policy. One promising theory argues that the composition of government debt should be chosen so that fluctuations in the market value of debt offset changes in expected future deficits. This complete market approach to debt management is valid even when the government only issues non-contingent bonds. A number of authors conclude from this approach that governments should issue long term debt and invest in short term assets. We argue that the conclusions of this approach are too fragile to serve as a basis for policy recommendations. This is because bonds at different maturities have highly correlated returns, causing the determination of the optimal portfolio to be ill-conditioned. To make this point concrete we examine the implications of this approach to debt management in various models, both analytically and using numerical methods calibrated to the US economy. We find the complete market approach recommends asset positions which are huge multiples of GDP. Introducing persistent shocks or capital accumulation only worsens this problem. Increasing the volatility of interest rates through habits partly reduces the size of these positions. Across these simulations we find no presumption that governments should issue long term debt: policy recommendations can be easily reversed through small perturbations in the specification of shocks or small variations in the maturity of bonds issued. We further extend the literature by removing the assumption that governments costlessly repurchase all outstanding debt every period. This exacerbates the size of the required positions, worsens their volatility, and in some cases produces instability in debt holdings. We conclude that it is very difficult to insulate fiscal policy from shocks by using the complete markets approach to debt management. Given the limited variability of the yield curve, using maturities is a poor way to substitute for state-contingent debt. As a result, the positions recommended by this approach conflict with a number of features that we believe are important in making bond markets incomplete, e.g., allowing for transaction costs, liquidity effects, etc. Until these features are fully incorporated, we remain in search of a theory of debt management capable of providing robust policy insights.
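To see why near-collinear maturity returns make the prescribed portfolios so fragile, a minimal numerical sketch (illustrative numbers only, not the paper's calibration): two non-contingent bonds whose state-contingent returns are almost parallel must span a fiscal shock, so the solved positions are huge multiples of GDP and reverse sign under a modest perturbation of one return.

```python
# Illustrative sketch, not the paper's calibration: two bonds with nearly
# collinear state-contingent returns must hedge a fiscal shock, so the
# implied positions are huge and fragile.
import numpy as np

# Gross returns of a short and a long bond in two fiscal states
# (rows = states, columns = bonds); the columns are nearly parallel.
R = np.array([[1.020, 1.041],
              [1.010, 1.019]])

# Funding need (share of GDP) the government wants to offset in each state.
b = np.array([0.02, -0.02])

q = np.linalg.solve(R, b)        # bond positions, in multiples of GDP
print(np.linalg.cond(R))         # ~350: the system is ill-conditioned
print(q)                         # ~(-3.4, +3.4): large offsetting positions

# Raising the long bond's return in the second state by 1.2 percentage
# points reverses the sign of both positions and inflates them enormously.
R2 = R.copy()
R2[1, 1] += 0.012
print(np.linalg.solve(R2, b))    # ~(+197, -193)
```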
Abstract:
In order to upgrade the reliability of xenodiagnosis, attention has been directed towards the population dynamics of the parasite, with particular interest in the following factors: 1. Parasite density, which is not a research objective by itself but, by giving an accurate portrayal of parasite development and multiplication, has been incorporated in the screening of bugs for xenodiagnosis. 2. On the assumption that food availability might increase parasite density, bugs from xenodiagnosis have been refed at biweekly intervals on chicken blood. 3. Infectivity rates and positives harbouring large parasite yields were based on gut infections, in which the parasite population, comprising all developmental forms, was more abundant and easier to detect than in fecal infections, thus minimizing the probability of recording false negatives. 4. Since parasite density, low in the first 15 days of infection, increases rapidly in the following 30 days, an interval of 45 days has been adopted for the routine examination of bugs from xenodiagnosis. By following the enumerated measures, all aimed at reducing false negative cases, we are getting closer to a reliable xenodiagnostic procedure. Upgrading the efficacy of xenodiagnosis also depends on the xenodiagnostic agent. Of the 9 vector species investigated, Panstrongylus megistus deserves top priority as a xenodiagnostic agent. Its extraordinary capability to support fast development and vigorous multiplication of the few parasites ingested from a host with chronic Chagas' disease is revealed by the strikingly close infectivity rates of 91.2% vs. 96.4% among bugs engorged from the same host in the chronic and acute phases of the disease, respectively (Table V), the latter with an estimated 12.3 × 10³ parasites in the circulation at the time of xenodiagnosis, as reported previously by the authors (1982).
Abstract:
Recent findings on the immunodiagnosis of schistosomiasis mansoni have shown that purified Schistosoma mansoni antigens do not provide maximum positivity. The authors therefore suggest the use of semi-purified antigens for diagnostic purposes. So far, no serological marker has been found for patients considered cured on the basis of negative stool examinations. However, a tendency toward decreasing IgG antibody titres was observed when egg antigen was used.
Abstract:
This paper presents a semisupervised support vector machine (SVM) that efficiently integrates the information of both labeled and unlabeled pixels. The method's performance is illustrated on the relevant problem of very high resolution (VHR) image classification of urban areas. The SVM is trained with a linear combination of two kernels: a base kernel working only on labeled examples is deformed by a likelihood kernel encoding similarities between labeled and unlabeled examples. Results obtained on VHR multispectral and hyperspectral images show the relevance of the method in the context of urban image classification. Moreover, its simplicity and the few parameters involved make the method versatile and workable by inexperienced users.
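As a rough illustration of the composite-kernel idea, the sketch below (Python with scikit-learn) combines an RBF base kernel on the labeled pixels with a likelihood-type kernel built from a Gaussian mixture fitted on all pixels; the mixture-posterior construction and the weight beta are assumptions for illustration, not the paper's exact deformation.

```python
# Minimal sketch (assumed construction, not the paper's exact kernel):
# combine a base RBF kernel on labeled pixels with a "likelihood" kernel
# built from a mixture model fit on labeled + unlabeled pixels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
labeled = rng.choice(len(X), size=50, replace=False)   # few labeled pixels
X_lab, y_lab = X[labeled], y[labeled]

# Likelihood kernel: inner product of mixture posteriors, estimated on
# ALL pixels so the unlabeled data shapes the similarity measure.
gmm = GaussianMixture(n_components=8, random_state=0).fit(X)
P_lab = gmm.predict_proba(X_lab)

beta = 0.5                                             # illustrative weight
K_train = beta * rbf_kernel(X_lab, gamma=0.1) + (1 - beta) * P_lab @ P_lab.T

clf = SVC(kernel="precomputed").fit(K_train, y_lab)

# Predicting new pixels needs the same composite kernel against X_lab.
X_new = X[:20]
K_new = (beta * rbf_kernel(X_new, X_lab, gamma=0.1)
         + (1 - beta) * gmm.predict_proba(X_new) @ P_lab.T)
print(clf.predict(K_new))
```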
Abstract:
This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
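For readers unfamiliar with the mechanic, the sketch below implements George and McCulloch-style SSVS for a plain linear regression, the base device the paper extends to restrictions on the cointegration space; the priors, simulated data, and fixed error variance are illustrative assumptions.

```python
# Compact sketch of Stochastic Search Variable Selection (SSVS) for a
# plain linear regression: a spike-and-slab Gibbs sampler. Everything
# below is illustrative; the paper applies the device to VECM restrictions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 200, 6
X = rng.normal(size=(n, k))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 1.0])
y = X @ beta_true + rng.normal(size=n)

# Spike/slab prior scales, prior inclusion probability, error variance
# (held fixed at its true value for brevity).
tau0, tau1, pi1, sigma2 = 0.01, 10.0, 0.5, 1.0
gamma = np.ones(k, dtype=int)
draws = []

for it in range(2000):
    # 1. beta | gamma: Gaussian, with the mixture prior as a diagonal precision.
    D_inv = np.diag(1.0 / np.where(gamma == 1, tau1**2, tau0**2))
    A = np.linalg.inv(X.T @ X / sigma2 + D_inv)
    beta = rng.multivariate_normal(A @ X.T @ y / sigma2, A)
    # 2. gamma_j | beta_j: Bernoulli, odds given by slab vs spike density.
    slab = pi1 * stats.norm.pdf(beta, 0, tau1)
    spike = (1 - pi1) * stats.norm.pdf(beta, 0, tau0)
    gamma = rng.binomial(1, slab / (slab + spike))
    if it >= 500:                       # discard burn-in
        draws.append(gamma)

print("posterior inclusion probabilities:", np.mean(draws, axis=0))
```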
Abstract:
CODEX SEARCH is an information retrieval engine specialized in immigration law that is based on linguistic tools and knowledge. A retrieval engine, or Information Retrieval (IR) system, is software capable of locating information in large document collections (a non-trivial environment) in electronic format. A preliminary study showed that immigration law is a discourse domain in which it is difficult to express an information need in the form of the formal query that current retrieval systems expect. Consequently, developing an efficient IR system for this domain requires more than the traditional IR model of comparing the terms of the query with those of the answer, essentially because these terms do not express implications and there need not be a one-to-one relation between them. The proposed linguistic solution therefore incorporates the specialist's knowledge by integrating a library of cases into the system. Cases are examples of procedures applied by experts to real-world problems that ended in success or failure. The results obtained in this first phase are very encouraging, but further research is needed to improve the performance of the prototype, which can be accessed at http://161.116.36.139/~codex/.
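A toy sketch of the case-library idea (hypothetical data and a deliberately simplified TF-IDF backbone, not the CODEX SEARCH implementation): the informal query is first matched against expert-solved cases, and the matched case's expert terminology is used to expand the query before ranking documents.

```python
# Hypothetical illustration of retrieval backed by a case library: match
# the layperson's query to a solved case, then expand the query with the
# expert's vocabulary before scoring documents.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "renewal of residence permit after expiry",
    "work authorization for seasonal employment",
    "family reunification visa requirements",
]
# Each case pairs an informal problem statement with expert terminology.
cases = {
    "my permit ran out, can I stay": "renewal residence permit expiry",
    "bring my spouse to live here": "family reunification visa",
}

vec = TfidfVectorizer().fit(documents + list(cases) + list(cases.values()))
D = vec.transform(documents)

def search(query: str) -> str:
    # 1. Find the solved case most similar to the informal query.
    case_keys = list(cases)
    sims = cosine_similarity(vec.transform([query]), vec.transform(case_keys))[0]
    best_case = case_keys[sims.argmax()]
    # 2. Expand the query with that case's expert terminology, then rank.
    expanded = query + " " + cases[best_case]
    scores = cosine_similarity(vec.transform([expanded]), D)[0]
    return documents[scores.argmax()]

print(search("my permit ran out, can I stay"))  # -> the renewal document
```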
Abstract:
This paper shows how one of the developers of QWERTY continued to use the trade secret that underlay its development to seek further efficiency improvements after its introduction. It provides further evidence that this was the principle used to design QWERTY in the first place, and adds weight to arguments that QWERTY itself was a consequence of creative design and an integral part of a highly efficient system rather than an accident of history. This serves to raise further questions over QWERTY's forced servitude as the 'paradigm case' of an inferior standard in the path dependence literature. The paper also shows how complementarities between forms of intellectual property rights protection played integral roles in the development of QWERTY and the search for improvements on it, and helped effectively conceal the source of the efficiency advantages that QWERTY helped deliver.
Abstract:
We consider a frictional two-sided matching market in which one side uses public cheap talk announcements so as to attract the other side. We show that if the first-price auction is adopted as the trading protocol, then cheap talk can be perfectly informative, and the resulting market outcome is efficient, constrained only by search frictions. We also show that the performance of an alternative trading protocol in the cheap-talk environment depends on the level of price dispersion generated by the protocol: If a trading protocol compresses (spreads) the distribution of prices relative to the first-price auction, then an efficient fully revealing equilibrium always (never) exists. Our results identify the settings in which cheap talk can serve as an efficient competitive instrument, in the sense that the central insights from the literature on competing auctions and competitive search continue to hold unaltered even without ex ante price commitment.
Abstract:
In a market in which sellers compete by posting mechanisms, we study how the properties of the meeting technology affect the mechanism that sellers select. In general, sellers have incentive to use mechanisms that are socially efficient. In our environment, sellers achieve this by posting an auction with a reserve price equal to their own valuation, along with a transfer that is paid by (or to) all buyers with whom the seller meets. However, we define a novel condition on meeting technologies, which we call “invariance,” and show that the transfer is equal to zero if and only if the meeting technology satisfies this condition.
Abstract:
We develop a life-cycle model of the labor market in which worker-firm matches differ in quality and assigning the right workers to the right firms is time consuming because of search and learning frictions. The rate at which workers move between unemployment and employment, and across different firms, is endogenous because search is directed: workers can choose whether to seek low-wage jobs that are easy to find or high-wage jobs that are hard to find. We calibrate our theory using data on labor market transitions aggregated across workers of different ages. We validate our theory by showing that it predicts quite well the pattern of labor market transitions for workers of different ages. Finally, we use our theory to decompose the age profiles of transition rates, wages and productivity into the effects of age variation in work-life expectancy, human capital and match quality.
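The wage/finding-rate tradeoff at the heart of the directed-search mechanism can be illustrated with a small numerical sketch under assumed functional forms (an urn-ball matching probability and a declining free-entry tightness schedule, neither taken from the paper's calibration):

```python
# Illustrative sketch (assumed functional forms, not the paper's model) of
# the directed-search tradeoff: workers choose among submarkets pairing a
# posted wage w with market tightness theta, where higher wages attract
# more applicants and are therefore harder to get.
import numpy as np

wages = np.linspace(0.5, 1.5, 11)          # posted wages across submarkets
# Free entry pins down tightness: submarkets with higher wages have lower
# tightness (here an assumed declining linear map).
theta = (1.6 - wages) / 0.8
p_find = 1 - np.exp(-theta)                # urn-ball job-finding probability
b = 0.4                                    # payoff of remaining unemployed

value = p_find * wages + (1 - p_find) * b  # one-shot payoff of applying
best = value.argmax()
print(f"optimal submarket: wage={wages[best]:.2f}, "
      f"find prob={p_find[best]:.2f}")     # interior optimum: the worker
                                           # forgoes the highest wage for a
                                           # better chance of finding a job
```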