33 results for automatic affect analysis
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This paper describes the main features and current results of MPRO-Spanish, a parser for morphological and syntactic analysis of unrestricted Spanish text developed at the IAI. The parser makes direct use of X-phrase structure rules to handle a variety of patterns from derivational morphology and syntactic structure. The morphological and syntactic analyses are carried out by two successive modules: one analyses and disambiguates the source words at the morphological level, while the other consists of a series of programs and a deterministic, procedural and explicit grammar. The article explains the main features of MPRO and summarizes experiments on some of its applications, some of which, such as monolingual and bilingual term extraction, are still being implemented, while others, such as indexing, need further work. The results and applications obtained so far with simple and relatively complex sentences give us grounds to believe in its reliability.
Abstract:
In recent years, traditional inequality measures have been used quite extensively to examine the international distribution of environmental indicators. One of their main characteristics is that each measure assigns different weights to changes occurring in different sections of the variable's distribution and, consequently, the results they yield can be very different. Hence, we suggest the appropriateness of using a range of well-recommended measures to achieve more robust results. We also provide an empirical test of the comparative behaviour of several suitable inequality measures and environmental indicators. Our findings support the hypothesis that in some cases the measures differ in both the sign and the size of the estimated evolution. JEL codes: D39; Q43; Q56. Keywords: international environment factor distribution; Kaya factors; inequality measurement
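As a concrete illustration of why the choice of measure matters, the sketch below computes two standard inequality indices, the Gini coefficient and the Theil T index, over a hypothetical vector of per-country emission intensities. The data and function names are invented for this example and are not taken from the paper.

```python
import numpy as np

def gini(x):
    """Gini coefficient of a 1-D array of non-negative values."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    # Mean-absolute-difference formulation: G = sum_i (2i - n - 1) x_i / (n * sum x)
    index = np.arange(1, n + 1)
    return np.sum((2 * index - n - 1) * x) / (n * x.sum())

def theil(x):
    """Theil T index, which weights the upper tail more heavily than the Gini."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.mean((x / mu) * np.log(x / mu))

# Hypothetical per-country CO2-per-capita values
emissions = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
print(gini(emissions), theil(emissions))
```

Because the two indices weight the distribution differently, a change concentrated in the upper tail can move them by different amounts, or even in different directions, which is the robustness issue the abstract raises.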
Abstract:
Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain-rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected by two operational C-band radars in the northern Italian region of Emilia Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. A comparison between different techniques and datasets for retrieving the vertical profile of the refractive index gradient in the boundary layer is then presented, and in particular their capability to detect anomalous propagation conditions is compared. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High-resolution radiosounding data are identified as the best available dataset for reproducing the local propagation conditions accurately, while lower-resolution standard TEMP data suffer from interpolation degradation, and Numerical Weather Prediction model data (Lokal Model) are able to capture a tendency toward superrefraction but not to detect ducting conditions. From the ray tracing of the centre, lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
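The propagation-condition classification described above can be sketched from first principles: radio refractivity N is computed from a radiosounding profile with the standard Bean and Dutton formula, and each layer is labelled by its vertical gradient, with superrefraction conventionally between -157 and -79 N-units/km and ducting below -157 N-units/km. The profile values below are invented for illustration and are not the paper's data.

```python
import numpy as np

def refractivity(P_hPa, T_K, e_hPa):
    """Radio refractivity N (Bean & Dutton): N = 77.6/T * (P + 4810*e/T)."""
    return 77.6 / T_K * (P_hPa + 4810.0 * e_hPa / T_K)

def classify_layers(z_m, N):
    """Label each layer by its vertical refractivity gradient (N-units/km):
    above -79 normal, -157..-79 superrefractive, below -157 ducting."""
    dNdz = np.diff(N) / np.diff(z_m) * 1000.0  # N-units per km
    labels = np.where(dNdz < -157, "ducting",
             np.where(dNdz < -79, "superrefractive", "normal"))
    return dNdz, labels

# Hypothetical sounding: temperature inversion plus sharp moisture drop near the surface
z = np.array([0.0, 100.0, 200.0, 300.0])        # height (m)
P = np.array([1013.0, 1001.0, 989.0, 977.0])    # pressure (hPa)
T = np.array([298.0, 299.5, 298.0, 297.0])      # temperature (K)
e = np.array([28.0, 18.0, 15.0, 14.0])          # vapour pressure (hPa)
dNdz, labels = classify_layers(z, refractivity(P, T, e))
print(list(labels))  # → ['ducting', 'superrefractive', 'normal']
```

This is exactly the kind of layer diagnosis that degrades when the input profile is interpolated from coarse TEMP levels: a thin ducting layer can be smoothed into an apparently superrefractive one, as the abstract reports for the lower-resolution datasets.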
Abstract:
Background: Accurate automatic segmentation of the caudate nucleus in magnetic resonance images (MRI) of the brain is of great interest in the analysis of developmental disorders. Segmentation methods based on a single atlas or on multiple atlases have been shown to localize the caudate structure suitably. However, the atlas prior information may not represent the structure of interest correctly, so it may be useful to introduce a more flexible technique for accurate segmentation.
Method: We present CaudateCut: a new fully automatic method for segmenting the caudate nucleus in MRI. CaudateCut combines an atlas-based segmentation strategy with the Graph Cut energy-minimization framework. We adapt the Graph Cut model to make it suitable for segmenting small, low-contrast structures, such as the caudate nucleus, by defining new energy-function data and boundary potentials. In particular, we exploit intensity and geometry information, and we add supervised energies based on contextual brain structures. Furthermore, we reinforce boundary detection using a new multi-scale edgeness measure.
Results: We apply the novel CaudateCut method to the segmentation of the caudate nucleus in a new set of 39 pediatric attention-deficit/hyperactivity disorder (ADHD) patients and 40 control children, as well as to a public database of 18 subjects. We evaluate the quality of the segmentation using several volumetric and voxel-by-voxel measures. Our results show improved segmentation performance compared to state-of-the-art approaches, obtaining a mean overlap of 80.75%. Moreover, we present a quantitative volumetric analysis of caudate abnormalities in pediatric ADHD, the results of which show strong correlation with expert manual analysis.
Conclusion: CaudateCut generates segmentation results that are comparable to gold-standard segmentations and that are reliable for analyzing differentiating neuroanatomical abnormalities between healthy controls and pediatric ADHD patients.
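The energy model behind Graph Cut segmentation can be illustrated on a toy 1-D signal: each pixel pays a data cost for its label, and adjacent pixels with different labels pay a contrast-sensitive boundary cost. The sketch below minimizes this energy by brute force purely for clarity; CaudateCut and other real systems minimize the same kind of energy efficiently via max-flow/min-cut on a graph. All values here are invented.

```python
import itertools
import math

def energy(labels, signal, mu_fg, mu_bg, lam):
    """Graph-Cut-style energy: unary data terms plus pairwise boundary terms."""
    data = sum((x - (mu_fg if l else mu_bg)) ** 2 for x, l in zip(signal, labels))
    # Contrast-sensitive boundary penalty: cheap to cut across strong edges
    boundary = sum(lam * math.exp(-(signal[i] - signal[i + 1]) ** 2)
                   for i in range(len(signal) - 1)
                   if labels[i] != labels[i + 1])
    return data + boundary

def segment(signal, mu_fg, mu_bg, lam=1.0):
    """Brute-force minimizer over all 2^n labelings (max-flow does this efficiently)."""
    n = len(signal)
    return min(itertools.product([0, 1], repeat=n),
               key=lambda labs: energy(labs, signal, mu_fg, mu_bg, lam))

# Toy 1-D "image": dark background (~0) with a bright structure (~5) in the middle
sig = [0.1, 0.0, 4.8, 5.1, 4.9, 0.2]
print(segment(sig, mu_fg=5.0, mu_bg=0.0))  # → (0, 0, 1, 1, 1, 0)
```

The paper's contribution sits precisely in these two terms: richer data potentials (intensity, geometry, supervised context) and a multi-scale edgeness measure in the boundary potential.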
Abstract:
The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer's disease (AD) detection, and it focuses on evaluating the suitability of a new approach to early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and determining its degree of severity. To this end, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD patients and control subjects). Two human capabilities have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as the Fractal Dimension, have been explored. The approach is non-invasive, low-cost and free of side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
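A minimal version of this classification setup, a small feed-forward neural network separating two classes from speech-derived features, might look like the following. The features, class means and network architecture are purely illustrative stand-ins, not the study's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D "speech features" (e.g. pause rate, pitch variability)
# for two classes: control subjects (0) and suspected AD patients (1).
X0 = rng.normal([0.2, 0.3], 0.1, size=(50, 2))
X1 = rng.normal([0.7, 0.8], 0.1, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single-hidden-layer network trained by batch gradient descent
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
lr = 1.0
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    p = sigmoid(H @ W2 + b2).ravel()   # class-1 probability
    g = (p - y)[:, None] / len(y)      # cross-entropy gradient w.r.t. the logits
    gH = (g @ W2.T) * (1 - H ** 2)     # backpropagate through tanh
    W2 -= lr * (H.T @ g); b2 -= lr * g.sum(0)
    W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(0)

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel() > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```

In the actual study the input vectors would be features extracted from Spontaneous Speech and Emotional Response recordings rather than synthetic Gaussians, and performance would be assessed on held-out subjects rather than on the training set.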
Abstract:
Alzheimer's disease is the most prevalent form of progressive degenerative dementia; it has a high socio-economic impact in Western countries. Therefore it is one of the most active research areas today. Alzheimer's is sometimes diagnosed by excluding other dementias, and definitive confirmation is only obtained through a post-mortem study of the brain tissue of the patient. The work presented here is part of a larger study that aims to identify novel technologies and biomarkers for early Alzheimer's disease detection, and it focuses on evaluating the suitability of a new approach to early diagnosis of Alzheimer's disease by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying Machine Learning algorithms to speech features obtained from suspected Alzheimer's sufferers in order to help diagnose this disease and determine its degree of severity. Two human capabilities relevant to communication have been analyzed for feature selection: Spontaneous Speech and Emotional Response. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of Alzheimer's disease patients.
Abstract:
Alzheimer's disease (AD) is the most prevalent form of progressive degenerative dementia, and it has a high socio-economic impact in Western countries; it is therefore one of the most active research areas today. Its diagnosis is sometimes made by excluding other dementias, and definitive confirmation must be obtained through a post-mortem study of the brain tissue of the patient. The purpose of this paper is to contribute to the improvement of early diagnosis of AD and the assessment of its degree of severity through an automatic analysis performed by non-invasive intelligent methods. The methods selected in this case are Automatic Spontaneous Speech Analysis (ASSA) and Emotional Temperature (ET), which have the great advantage of being non-invasive, low-cost and free of side effects.
Abstract:
Alzheimer's disease (AD) is the most prevalent form of progressive degenerative dementia, and it has a high socio-economic impact in Western countries; it is therefore one of the most active research areas today. Its diagnosis is sometimes made by excluding other dementias, and definitive confirmation must be obtained through a post-mortem study of the brain tissue of the patient. The purpose of this paper is to contribute to the improvement of early diagnosis of AD and the assessment of its degree of severity through an automatic analysis performed by non-invasive intelligent methods. The methods selected in this case are Automatic Spontaneous Speech Analysis (ASSA) and Emotional Temperature (ET), which have the great advantage of being non-invasive, low-cost and free of side effects.
Abstract:
Alzheimer's disease (AD) is the most common type of dementia among the elderly. This work is part of a larger study that aims to identify novel technologies and biomarkers or features for the early detection of AD and its degree of severity. The diagnosis is made by analyzing several biomarkers and conducting a variety of tests (although only a post-mortem examination of the patient's brain tissue is considered to provide definitive confirmation). Non-invasive intelligent diagnosis techniques would therefore be a very valuable diagnostic aid. This paper concerns the Automatic Analysis of Emotional Response (AAER) in spontaneous speech based on classical and new emotional speech features: Emotional Temperature (ET) and fractal dimension (FD). This is a pre-clinical study aiming to validate tests and biomarkers for future diagnostic use. The method has the great advantage of being non-invasive, low-cost and free of side effects. The AAER shows very promising results for the definition of features useful in the early diagnosis of AD.
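One of the features named here, the fractal dimension of a speech signal, is commonly estimated with Higuchi's algorithm. The sketch below is a generic implementation, not the paper's code, and it checks the two textbook reference cases: a smooth signal has FD near 1, while white noise has FD near 2.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    logs = []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)
            if len(idx) < 2:
                continue
            # Normalized curve length of the sub-series sampled every k points
            length = (np.abs(np.diff(x[idx])).sum() * (N - 1)
                      / ((len(idx) - 1) * k * k))
            Lk.append(length)
        logs.append((np.log(1.0 / k), np.log(np.mean(Lk))))
    # The fractal dimension is the slope of log L(k) against log(1/k)
    xs, ys = zip(*logs)
    return np.polyfit(xs, ys, 1)[0]

rng = np.random.default_rng(1)
line = np.linspace(0.0, 1.0, 1000)      # smooth signal -> FD near 1
noise = rng.standard_normal(1000)       # white noise -> FD near 2
print(higuchi_fd(line), higuchi_fd(noise))
```

The intuition for its use here is that FD summarizes the irregularity of the speech waveform in a single number, which can then be fed to a classifier alongside the ET features.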
Abstract:
Extending the traditional input-output model to account for the environmental impacts of production processes reveals the channels by which environmental burdens are transmitted throughout the economy. In particular, the environmental input-output approach is a useful technique for quantifying the changes in the levels of greenhouse emissions caused by changes in the final demand for production activities. The input-output model can also be used to determine the changes in the relative composition of greenhouse gas emissions due to exogenous inflows. In this paper we describe a method for evaluating how exogenous changes in sectorial demand, such as changes in private consumption, public consumption, investment and exports, affect the relative contribution of the six major greenhouse gases regulated by the Kyoto Protocol to total greenhouse emissions. The empirical application is for Spain, and the economic and environmental data are for the year 2000. Our results show that there are significant differences in the effects of different sectors on the composition of greenhouse emissions. Therefore, the final impact on the relative contribution of pollutants will basically depend on the activity that receives the exogenous shock in final demand, because there are considerable differences in the way, and the extent to which, individual activities affect the relative composition of greenhouse gas emissions. Keywords: greenhouse emissions; composition of emissions; sectorial demand; exogenous shock.
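The mechanism can be written compactly: with a technical-coefficient matrix A and a vector of per-unit emission intensities e, the emissions induced by a final-demand vector d are e'(I - A)^-1 d, so a demand shock in one sector propagates through the Leontief inverse into economy-wide emissions. The three-sector numbers below are invented for illustration, not the Spanish data used in the paper.

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix A (intermediate inputs
# per unit of output) and per-unit greenhouse-gas emission intensities e.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
e = np.array([2.0, 0.5, 1.2])        # e.g. tonnes CO2-eq per unit of output

L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse (I - A)^-1

def emissions(final_demand):
    """Total emissions induced by a final-demand vector: e' (I - A)^-1 d."""
    return e @ L @ final_demand

d = np.array([100.0, 80.0, 60.0])
base = emissions(d)
# Exogenous shock: +10 units of final demand for sector 0
shock = emissions(d + np.array([10.0, 0.0, 0.0]))
print(base, shock - base)
```

In the paper's setting the same calculation is run gas by gas for the six Kyoto gases, so a shock shifts not only the total but also the relative composition of emissions, depending on which sector receives it.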
Abstract:
We study whether people's behavior in unbalanced gift exchange markets with repeated interaction is affected by whether they are on the excess supply side or the excess demand side of the market. Our analysis is based on a comparison of behavior between two types of experimental gift exchange markets, which vary only with respect to whether first or second movers are on the long side of the market. The direction of market imbalance could influence subjects' behavior, as second movers (workers) might react differently to favorable actions by first movers (firms) in the two cases. While our data show strong deviations from the standard game-theoretic prediction, we find mainly secondary treatment effects. Wage offers are not higher when there is an excess supply of firms, and workers do not respond more favorably to a given wage when there is an excess supply of labor. The state of competition does not appear to have strong effects in our data. We also present data from single-period sessions that show substantial gift exchange even without repeated interaction.
Abstract:
The problem of monitoring digital television broadcasts across Europe for the development of robust and reliable receivers is increasingly significant, and hence the need arises to automate the process of analysing and monitoring these signals. This project presents the software development of an application intended to solve part of this problem. The application analyses, manages and captures digital television signals. This document first introduces the central subject matter, digital television and the information carried by television signals, specifically as defined by the "Digital Video Broadcasting" standard. The text then concentrates on explaining and describing the functionality the application needs to cover, as well as introducing and explaining each stage of a software development process. Finally, the advantages of creating this program for automating digital signal analysis, based on an optimisation of resources, are summarised.
Abstract:
We investigate the determinants of teamwork and workers' cooperation within the firm. Up to now the literature has focused almost exclusively on workers' incentives as the main determinants of workers' cooperation. We take a broader look at the firm's organizational design and analyze the impact that different aspects of it might have on cooperation. In particular, we consider the way in which the degree of decentralization of decisions and the use of complementary HRM practices (what we call the firm's vertical organizational design) can affect workers' collaboration with each other. We test the model's predictions on a unique dataset of Spanish small and medium-sized firms containing a rich set of variables that allows us to use sensible proxies for workers' cooperation. We find that the decentralization of labor decisions (and to a lesser extent that of task planning) has a positive impact on workers' cooperation. Likewise, cooperation is positively correlated with many of the HRM practices that seem to favor workers' interaction the most. We also confirm the previous finding that collaborative efforts respond positively to pay incentives and, particularly, to group or company incentives.
Abstract:
This paper presents a tool for the analysis and regeneration of Web contents, implemented with XML and Java. At present, Web content delivery from server to clients is carried out without taking clients' characteristics into account. Heterogeneous and diverse characteristics, such as users' preferences, the different capacities of client devices, different types of access, the state of the network and the current load on the server, directly affect the behavior of Web services. Moreover, the growing use of multimedia objects in the design of Web contents takes no account of this diversity and heterogeneity, which further hinders appropriate content delivery. Thus, the objective of the presented tool is the treatment of Web pages, taking the aforementioned heterogeneity into account and adapting contents in order to improve performance on the Web.
Abstract:
A recent trend in digital mammography is computer-aided diagnosis systems, which are computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity decreases significantly as the density of the breast increases, and this dependence is method-specific. In this paper we propose a new approach to the classification of mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast tissue texture. Classification performance was evaluated on a large set of digitised mammograms, using different classifiers and a leave-one-out methodology. The results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.
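The leave-one-out methodology mentioned in the evaluation can be sketched with a simple 1-nearest-neighbour classifier: each mammogram is classified using all the others as the reference set, and accuracy is averaged over every held-out case. The feature values below are synthetic stand-ins, not real mammographic texture data.

```python
import numpy as np

def loo_accuracy(X, y):
    """Leave-one-out evaluation of a 1-nearest-neighbour classifier."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    correct = 0
    for i in range(len(X)):
        # Distances from the held-out sample to every other sample
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                    # exclude the held-out sample itself
        correct += y[d.argmin()] == y[i]
    return correct / len(X)

rng = np.random.default_rng(0)
# Hypothetical 2-D texture features for low-density vs high-density breasts
low = rng.normal([0.0, 0.0], 0.3, size=(30, 2))
high = rng.normal([2.0, 2.0], 0.3, size=(30, 2))
X = np.vstack([low, high])
y = np.array([0] * 30 + [1] * 30)
print(loo_accuracy(X, y))
```

Leave-one-out is a natural choice when, as in this study, the dataset of annotated mammograms is too small to spare a fixed held-out test set.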