47 results for Automatic adjustment
Abstract:
The work presented here addresses cue-based noun classification in English and Spanish. Its main objective is to automatically acquire lexical semantic information by classifying nouns into previously known lexical classes. This is achieved by using particular aspects of linguistic contexts as cues that identify a specific lexical class. Here we concentrate on the task of identifying such cues and on the theoretical background that allows for an assessment of the complexity of the task. The results show that, despite the a priori complexity of the task, cue-based classification is a useful tool in the automatic acquisition of lexical semantic classes.
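The abstract does not spell out the classification procedure; the following Python sketch is only a minimal illustration of the cue-based idea, with hypothetical cue patterns and a simple cue-count score that are not taken from the paper:

```python
from collections import Counter

# Hypothetical cue patterns per lexical class (illustrative only, not the paper's cues).
CUES = {
    "event": ["during the {n}", "the {n} took place", "the {n} lasted"],
    "human": ["the {n} who", "the {n} said"],
}

def classify(noun: str, corpus: str) -> str:
    """Assign the class whose cue patterns co-occur most often with the noun."""
    scores = Counter()
    for label, patterns in CUES.items():
        for pattern in patterns:
            scores[label] += corpus.count(pattern.format(n=noun))
    best = scores.most_common(1)
    return best[0][0] if best and best[0][1] > 0 else "unknown"

print(classify("meeting", "the meeting took place after lunch; during the meeting nobody spoke"))
```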
Abstract:
Automatic creation of polarity lexicons is a crucial issue to be solved in order to reduce the time and effort spent in the first steps of Sentiment Analysis. In this paper we present a methodology based on linguistic cues that allows us to automatically discover, extract and label subjective adjectives that should be collected in a domain-based polarity lexicon. For this purpose, we designed a bootstrapping algorithm that, from a small set of seed polar adjectives, is capable of iteratively identifying, extracting and annotating positive and negative adjectives. Additionally, the method automatically creates lists of highly subjective elements that change their prior polarity even within the same domain. The proposed algorithm reached a precision of 97.5% for positive adjectives and 71.4% for negative ones in the semantic orientation identification task.
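The bootstrapping loop is described only at a high level; the Python sketch below is a minimal, hypothetical rendering of the idea (the seed list and the single coordination cue are illustrative assumptions, not the cues actually used in the paper):

```python
import re

# Hypothetical seed lexicon of polar adjectives (illustrative only).
SEEDS = {"good": "+", "excellent": "+", "bad": "-", "awful": "-"}
# Coordination cue: adjectives conjoined by "and" tend to share polarity.
CUE = re.compile(r"(\w+) and (\w+)")

def bootstrap(corpus: str, iterations: int = 3) -> dict:
    """Iteratively propagate polarity from seeds to new adjectives via the cue."""
    lexicon = dict(SEEDS)
    for _ in range(iterations):
        new = {}
        for a, b in CUE.findall(corpus.lower()):
            if a in lexicon and b not in lexicon:
                new[b] = lexicon[a]          # propagate polarity along the cue
            elif b in lexicon and a not in lexicon:
                new[a] = lexicon[b]
        if not new:
            break
        lexicon.update(new)
    return lexicon

print(bootstrap("the food was good and tasty but the service was awful and slow"))
```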
Abstract:
Lexical Resources are a critical component for Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to having richer resources with a broad range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method for automating the merging of resources, with special emphasis on what we call the mapping step. This mapping step, which converts the resources into a common format that later allows the merging, is usually performed with huge manual effort and thus makes the whole process very costly. We therefore propose a method to perform this mapping fully automatically. To test our method, we have addressed the merging of two verb subcategorization frame lexica for Spanish. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
Abstract:
In this work we present the results of experiments on the development of lexical class-based lexica by automatic means. Our purpose is to assess the use of linguistic lexical-class-based information as a feature selection methodology for classifiers in quick lexical development. The results show that the approach can significantly reduce the human effort required in the development of language resources.
Abstract:
Lexical Resources are a critical component for Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources and a broader range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method towards the automatic merging of resources. This method includes both the automatic mapping of the resources involved to a common format and the merging of the resources once they are in this format. This paper presents how we have addressed the merging of two verb subcategorization frame lexica for Spanish, but our method will be extended to cover other types of Lexical Resources. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
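Since the mapping step is only described abstractly, here is a minimal Python sketch of the map-then-merge idea (the toy entry formats and mapping rules below are hypothetical, not the actual formats of the two Spanish subcategorization lexica):

```python
# Two toy lexica in different native formats (hypothetical examples).
LEXICON_A = {"dar": ["NP NP", "NP PP_a"]}                 # frames as space-separated strings
LEXICON_B = {"dar": [("obj", "iobj")], "ver": [("obj",)]} # frames as tuples of slot labels

def map_a(frames):
    """Map lexicon A entries to the common format: tuples of slot labels."""
    return {tuple(f.replace("PP_a", "iobj").replace("NP", "obj").split()) for f in frames}

def map_b(frames):
    """Lexicon B is already close to the common format; just normalize to tuples."""
    return {tuple(f) for f in frames}

def merge(lex_a, lex_b):
    """Union the mapped frame sets verb by verb."""
    merged = {}
    for verb in set(lex_a) | set(lex_b):
        merged[verb] = map_a(lex_a.get(verb, [])) | map_b(lex_b.get(verb, []))
    return merged

print(merge(LEXICON_A, LEXICON_B))
```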
Abstract:
Does financial development result in capital being reallocated more rapidly to industries where it is most productive? We argue that if this were the case, financially developed countries should see faster growth in industries with investment opportunities due to global demand and productivity shifts. Testing this cross-industry cross-country growth implication requires proxies for (latent) global industry investment opportunities. We show that tests relying only on data from specific (benchmark) countries may yield spurious evidence for or against the hypothesis. We therefore develop an alternative approach that combines benchmark-country proxies with a proxy that does not reflect opportunities specific to a country or level of financial development. Our empirical results yield clear support for the capital reallocation hypothesis.
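The abstract does not reproduce the test equation; purely as an illustrative sketch of the kind of cross-industry cross-country growth specification it refers to (an assumed form, not necessarily the authors' exact one), one can write

\[
\text{Growth}_{c,i} \;=\; \alpha_c + \alpha_i + \beta\,\big(FD_c \times GO_i\big) + \gamma' X_{c,i} + \varepsilon_{c,i},
\]

where \(FD_c\) is the financial development of country \(c\), \(GO_i\) a proxy for the (latent) global investment opportunities of industry \(i\), and a positive \(\beta\) would support the reallocation hypothesis; the paper's contribution concerns how \(GO_i\) is proxied without relying only on benchmark countries.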
Abstract:
Using new quarterly data for hours worked in OECD countries, Ohanian and Raffo (2011) argue that in many OECD countries, particularly in Europe, hours per worker are quantitatively important as an intensive margin of labor adjustment, possibly because labor market frictions are higher than in the US. I argue that this conclusion is not supported by the data. Using the same data on hours worked, I find evidence that labor market frictions are higher in Europe than in the US, like Ohanian and Raffo, but also that these frictions seem to affect the intensive margin at least as much as the extensive margin of labor adjustment.
Abstract:
The emphasis on integrated care implies new incentives that promote coordination between levels of care. Considering a population as a whole, the resource allocation system has to adapt to this environment. This research aims to design a model that allows for morbidity-related prospective and concurrent capitation payment. The model can be applied in publicly funded health systems and managed competition settings. Methods: We analyze the application of hybrid risk adjustment versus either prospective or concurrent risk adjustment formulae in the context of funding total health expenditures for the population of an integrated healthcare delivery organization in Catalonia during the years 2004 and 2005. Results: The hybrid model reimburses integrated care organizations while avoiding excessive risk transfer and maximizing incentives for efficiency in provision. At the same time, it eliminates incentives for risk selection for a specific set of high-risk individuals through the use of concurrent reimbursement, in order to assure a proper classification of patients. Conclusion: Prospective risk adjustment is used to transfer financial risk to the health provider and therefore provide incentives for efficiency. Within the context of a National Health System, such transfer of financial risk is illusory, and the government has to cover the deficits. Hybrid risk adjustment is useful to provide the right combination of incentives for efficiency and an appropriate level of risk transfer for integrated care organizations.
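As an illustrative formalization of a hybrid scheme (a common way to write such payments, assumed here since the abstract does not give the study's exact formula), the capitation payment for individual \(j\) can be expressed as a convex combination of a prospective and a concurrent prediction of cost:

\[
P_j \;=\; \lambda\,\hat{c}^{\,\text{prosp}}_j + (1-\lambda)\,\hat{c}^{\,\text{conc}}_j, \qquad 0 \le \lambda \le 1,
\]

where \(\hat{c}^{\,\text{prosp}}_j\) is predicted from prior-period morbidity, \(\hat{c}^{\,\text{conc}}_j\) from current-period morbidity, and \(\lambda\) governs how much financial risk is transferred to the provider; the concurrent component can also be restricted to an identified set of high-risk individuals, as described above.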
Abstract:
This paper analyses the application of hybrid risk adjustment versus either prospective or concurrent risk adjustment formulae in the context of funding pharmaceutical benefits for the population of an integrated healthcare delivery organization in Catalonia during the years 2002 and 2003. We apply a mixed formula and find that, compared with purely prospective models, a hybrid risk adjustment model increases incentives for efficiency in the provision of care to low-risk individuals, at health organizations as a whole and at each internal department, by reducing within-group variation of drug expenditures.
Abstract:
We investigate the hypothesis that the atmosphere is constrained to maximize its entropy production by using a one-dimensional (1-D) vertical model. We prescribe the lapse rate in the convective layer as that of the standard troposphere. The assumption that convection sustains a critical lapse rate was absent in previous studies, which focused on the vertical distribution of climatic variables, since such a convective adjustment reduces the degrees of freedom of the system and may prevent the application of the maximum entropy production (MEP) principle. This is not the case in the radiative–convective model (RCM) developed here, since we accept a discontinuity of temperatures at the surface similar to that adopted in many RCMs. For current conditions, the MEP state gives a difference between the ground temperature and the air temperature at the surface of ≈10 K. In comparison, conventional RCMs obtain a discontinuity of only ≈2 K. However, the surface boundary layer velocity in the MEP state appears reasonable (≈3 m s⁻¹). Moreover, although the convective flux at the surface in MEP states is almost uniform in optically thick atmospheres, it reaches a maximum value for an optical thickness similar to current conditions. This additional result may support the maximum convection hypothesis suggested by Paltridge (1978).
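As a sketch of what "maximum entropy production" means for such a column (an illustrative discrete-layer formulation; the model's exact functional is not reproduced in the abstract), the convective heat fluxes \(F_i\) carried upward from a warmer level at temperature \(T_i\) to the cooler level above at \(T_{i+1}\) are chosen so that the material entropy production

\[
\dot{S} \;=\; \sum_i F_i \left(\frac{1}{T_{i+1}} - \frac{1}{T_i}\right)
\]

is maximized subject to the energy balance of each layer; the surface discontinuity discussed above corresponds to the flux between the ground at \(T_g\) and the surface air at \(T_a\).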
Abstract:
The aim of this paper is to analyse whether Spanish municipalities adjust in the presence of a budget shock and, if so, which budget items carry out the adjustment. The methodology used to answer these questions is an error-correction mechanism, a VECM, which we estimate with a panel of data on Spanish municipalities for the period 1988-2006. Our results confirm, first, that municipalities adjust in the presence of a fiscal shock (that is, the deficit is stationary in the long run). Second, we find that when the shock affects revenues, the adjustment is borne mainly by the municipality through spending cuts, with transfers playing a very limited role in this adjustment process. By contrast, when the shock affects spending, the adjustment is shared in similar proportions between the municipality, which raises taxes, and higher levels of government, which increase transfers. These results suggest that the sustainability of local public finances can be achieved under different institutional settings.
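Although the abstract does not give the estimated system, an illustrative panel error-correction sketch (assuming, as is common, a single cointegrating relation between revenues and expenditures, i.e. the deficit) would be

\[
\begin{aligned}
\Delta r_{it} &= \alpha_r\,(r_{i,t-1} - e_{i,t-1}) + \textstyle\sum_k \gamma^r_k\,\Delta r_{i,t-k} + \sum_k \delta^r_k\,\Delta e_{i,t-k} + \mu_i + \varepsilon^r_{it},\\
\Delta e_{it} &= \alpha_e\,(r_{i,t-1} - e_{i,t-1}) + \textstyle\sum_k \gamma^e_k\,\Delta r_{i,t-k} + \sum_k \delta^e_k\,\Delta e_{i,t-k} + \eta_i + \varepsilon^e_{it},
\end{aligned}
\]

where \(r_{it}\) and \(e_{it}\) are revenues and expenditures of municipality \(i\) in year \(t\); stationarity of the deficit corresponds to the error-correction term entering with the appropriate signs, and the relative sizes of \(\alpha_r\) and \(\alpha_e\) indicate which side of the budget bears the adjustment (equations for transfers would be added analogously).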
Abstract:
Background: Accurate automatic segmentation of the caudate nucleus in magnetic resonance images (MRI) of the brain is of great interest in the analysis of developmental disorders. Segmentation methods based on a single atlas or on multiple atlases have been shown to suitably localize the caudate structure. However, the atlas prior information may not represent the structure of interest correctly. It may therefore be useful to introduce a more flexible technique for accurate segmentations. Method: We present CaudateCut: a new fully automatic method of segmenting the caudate nucleus in MRI. CaudateCut combines an atlas-based segmentation strategy with the Graph Cut energy-minimization framework. We adapt the Graph Cut model to make it suitable for segmenting small, low-contrast structures, such as the caudate nucleus, by defining new energy function data and boundary potentials. In particular, we exploit information concerning intensity and geometry, and we add supervised energies based on contextual brain structures. Furthermore, we reinforce boundary detection using a new multi-scale edgeness measure. Results: We apply the novel CaudateCut method to the segmentation of the caudate nucleus in a new set of 39 pediatric attention-deficit/hyperactivity disorder (ADHD) patients and 40 control children, as well as to a public database of 18 subjects. We evaluate the quality of the segmentation using several volumetric and voxel-by-voxel measures. Our results show improved segmentation performance compared to state-of-the-art approaches, obtaining a mean overlap of 80.75%. Moreover, we present a quantitative volumetric analysis of caudate abnormalities in pediatric ADHD, the results of which show strong correlation with expert manual analysis. Conclusion: CaudateCut generates segmentation results that are comparable to gold-standard segmentations and which are reliable in the analysis of differentiating neuroanatomical abnormalities between healthy controls and pediatric ADHD patients.
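For reference, the Graph Cut energy-minimization framework that CaudateCut builds on minimizes, over binary voxel labelings \(L = \{l_p\}\) (caudate vs. background), an energy of the standard form

\[
E(L) \;=\; \sum_{p \in \mathcal{P}} D_p(l_p) \;+\; \lambda \sum_{(p,q) \in \mathcal{N}} V_{pq}(l_p, l_q),
\]

where \(D_p\) is the data potential of voxel \(p\) (here built from intensity, geometry, atlas and contextual-structure terms) and \(V_{pq}\) the boundary potential between neighbouring voxels (here reinforced with the multi-scale edgeness measure); the minimizing labeling is obtained with a min-cut/max-flow algorithm. The specific potentials defined by CaudateCut are given in the paper rather than in the abstract.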