997 results for Computational Lexical Semantics
Abstract:
Understanding the basis on which recruiters form hirability impressions for a job applicant is a key issue in organizational psychology and can be addressed as a social computing problem. We approach the problem from a face-to-face, nonverbal perspective where behavioral feature extraction and inference are automated. This paper presents a computational framework for the automatic prediction of hirability. To this end, we collected an audio-visual dataset of real job interviews where candidates were applying for a marketing job. We automatically extracted audio and visual behavioral cues related to both the applicant and the interviewer. We then evaluated several regression methods for the prediction of hirability scores and showed the feasibility of conducting such a task, with ridge regression explaining 36.2% of the variance. Feature groups were analyzed, and two main groups of behavioral cues were predictive of hirability: applicant audio features and interviewer visual cues, showing the predictive validity of cues related not only to the applicant, but also to the interviewer. As a last step, we analyzed the predictive validity of psychometric questionnaires often used in the personnel selection process, and found that these questionnaires were unable to predict hirability, suggesting that hirability impressions were formed based on the interaction during the interview rather than on questionnaire data.
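As a rough illustration of the evaluation described above, the sketch below fits an L2-regularized (ridge) regression and reports cross-validated R², the "variance explained" figure the abstract cites; the feature matrix and scores here are synthetic placeholders, not the authors' data or pipeline.

```python
# Minimal sketch of a ridge-regression evaluation for hirability prediction.
# X (interviews x behavioral cues) and y (hirability scores) are synthetic
# placeholders; names and sizes are illustrative only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(62, 20))                                # cue features
y = X[:, :3].sum(axis=1) + rng.normal(scale=2.0, size=62)    # scores

model = Ridge(alpha=1.0)                 # L2-regularized linear regression
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"mean cross-validated R^2 = {r2.mean():.3f}")  # variance explained
```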
Abstract:
Although approximately 50% of Down Syndrome (DS) patients have heart abnormalities, they exhibit an overprotection against cardiac abnormalities related to the connective tissue, for example a lower risk of coronary artery disease. A recent study reported the case of a person affected by DS who carried mutations in FBN1, the causative gene for a connective tissue disorder called Marfan Syndrome (MFS). The fact that the person did not have any cardiac alterations suggested compensation effects due to DS. This observation is supported by a previous DS meta-analysis at the molecular level in which we found an overall upregulation of FBN1 (which is usually downregulated in MFS). Additionally, that result was cross-validated with independent expression data from DS heart tissue. The aim of this work is to elucidate the role of FBN1 in DS and to establish a molecular link to MFS and MFS-related syndromes using a computational approach. To that end, we applied different analytical approaches to two DS studies (our previous meta-analysis and independent expression data from DS heart tissue) and revealed expression alterations in the FBN1 interaction network, in FBN1 co-expressed genes, and in FBN1-related pathways. After merging the significant results from the different datasets with a Bayesian approach, we prioritized 85 genes that were able to distinguish control from DS cases. We further found evidence for several of these genes (47%), such as FBN1, DCN, and COL1A2, being dysregulated in MFS and MFS-related diseases. Consequently, we encourage the scientific community to take FBN1 and its related network into account in the study of DS cardiovascular characteristics.
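A hedged sketch of the kind of Bayesian merging mentioned above (not the authors' exact procedure): per-dataset evidence that a gene is dysregulated is expressed as a Bayes factor, and the posterior odds are updated multiplicatively across datasets; the prior and the Bayes factors below are hypothetical.

```python
# Illustrative only: Bayesian merging of dysregulation evidence across
# datasets by multiplying the prior odds with one Bayes factor per dataset.
def posterior_prob(bayes_factors, prior=0.1):
    odds = prior / (1.0 - prior)          # assumed prior fraction of genes
    for bf in bayes_factors:              # one Bayes factor per dataset
        odds *= bf
    return odds / (1.0 + odds)

# hypothetical Bayes factors for one gene from the meta-analysis and the
# DS heart-tissue data
print(f"posterior = {posterior_prob([8.0, 5.0]):.2f}")  # -> 0.82, prioritized
```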
Abstract:
Top-down contextual influences play a major part in speech understanding, especially in hearing-impaired patients with deteriorated auditory input. Those influences are most obvious in difficult listening situations, such as listening to sentences in noise, but can also be observed at the word level under more favorable conditions, as in one of the most commonly used tasks in audiology, i.e., repeating isolated words in silence. This study aimed to explore the role of top-down contextual influences and their dependence on lexical factors and patient-specific factors using standard clinical linguistic material. Spondaic word perception was tested in 160 hearing-impaired patients aged 23-88 years with a four-frequency average pure-tone threshold ranging from 21 to 88 dB HL. Sixty spondaic words were randomly presented at a level adjusted to correspond to a speech perception score ranging between 40 and 70% of the performance-intensity function obtained using monosyllabic words. Phoneme and whole-word recognition scores were used to calculate two context-influence indices (the j factor and the ratio of word scores to phonemic scores), which were correlated with linguistic factors, such as the phonological neighborhood density and several indices of word occurrence frequency. Contextual influence was greater for spondaic words than in similar studies using monosyllabic words, with an overall j factor of 2.07 (SD = 0.5). For both indices, context use decreased with increasing hearing loss once the average hearing loss exceeded 55 dB HL. In right-handed patients, significantly greater context influence was observed for words presented in the right ear than for words presented in the left, especially in patients with many years of education. The correlations between raw word scores (and context-influence indices) and word occurrence frequencies showed a significant age-dependent effect, with a stronger correlation between perception scores and word occurrence frequencies when the occurrence frequencies were based on the years corresponding to the patients' youth, showing a "historic" word frequency effect. This effect was still observed for patients with few years of formal education, but recent occurrence frequencies based on current word exposure had a stronger influence for those patients, especially for younger ones.
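For reference, the j factor cited above is conventionally defined by treating a word as if it consisted of j independently recognized parts, so that the whole-word score follows from the phoneme score:

```latex
% Standard definition of the j factor: with whole-word recognition
% probability p_w and phoneme recognition probability p_p,
p_w = p_p^{\,j}, \qquad j = \frac{\log p_w}{\log p_p}
% Smaller j (fewer effective independent parts than actual phonemes)
% indicates a greater top-down, contextual contribution to recognition.
```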
Abstract:
It is often assumed that total head losses in a sand filter are solely due to the filtration media and that there are analytical solutions, such as the Ergun equation, to compute them. However, total head losses are also due to auxiliary elements (inlet and outlet pipes and filter nozzles), which produce undesirable head losses because they increase energy requirements without contributing to the filtration process. In this study, ANSYS Fluent version 6.3, a commercial computational fluid dynamics (CFD) software program, was used to compute head losses in different parts of a sand filter. Six numerical filter models of varying complexity were used to understand the hydraulic behavior of the several filter elements and their importance in total head losses. The simulation results show that 84.6% of the total head losses were caused by the sand bed and 15.4% by the auxiliary elements (4.4% in the outlet and inlet pipes, and 11.0% in the perforated plate and nozzles). Simulation results with the different models show the important role of the nozzles in the hydraulic behavior of the sand filter. The ratio of the passing area through the nozzles to the passing area through the perforated plate is an important design parameter for the reduction of total head losses. A reduction in this ratio caused by nozzle clogging would disproportionately increase the total head losses in the sand filter.
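For context, the Ergun equation referred to above predicts only the head loss across the packed sand bed itself, which is why the auxiliary elements must be evaluated separately (e.g. by CFD):

```latex
% Ergun equation: pressure drop per unit bed height L for a packed bed with
% particle diameter d_p, porosity \varepsilon, superficial velocity v,
% fluid viscosity \mu and density \rho.
\frac{\Delta p}{L}
  = 150\,\frac{\mu (1-\varepsilon)^2}{\varepsilon^3 d_p^{2}}\, v
  + 1.75\,\frac{\rho (1-\varepsilon)}{\varepsilon^3 d_p}\, v^{2}
```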
Abstract:
The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but also on wider service-by-service compositions achieved by subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services such as open-id, trust management, monitors, and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and that enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable business and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
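A minimal sketch of simulated annealing at constant temperature, i.e. a Metropolis acceptance rule with fixed T, which is the algorithmic core the abstract names; cost() and neighbor() are placeholders standing in for the actual negotiation-style model.

```python
# Constant-temperature simulated annealing (Metropolis rule with fixed T).
# cost() scores a candidate negotiation style; neighbor() perturbs one.
# Both are placeholders for the recommender's real model.
import math
import random

def recommend(initial, cost, neighbor, T=1.0, steps=10_000):
    state, best = initial, initial
    for _ in range(steps):
        cand = neighbor(state)
        delta = cost(cand) - cost(state)
        # always accept improvements; accept worsenings with prob. e^(-delta/T)
        if delta <= 0 or random.random() < math.exp(-delta / T):
            state = cand
        if cost(state) < cost(best):
            best = state
    return best
```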
Abstract:
Since its discovery, the C60 molecule has been known to form a variety of derivatives, depending on the type of functionalization, with specific physicochemical properties of great scientific interest. A selection of derivatives corresponding to single or multiple additions to C60 has been considered in this research work. Computational chemistry studies of several types of addition to C60 were carried out in order to answer questions that are experimentally not understood or remain unclear. The systems studied with regard to single addition to C60 were, first, the monoiminofullerenes, C60NR (for the two routes proposed for their synthesis, kinetic and thermodynamic analyses helped to explain the formation mechanisms and to justify the addition to [5,6]-type bonds), and, second, the substituted methanofullerenes and hydrofullerenes, C60CHR and C60HR (geometric, electronic, energetic, and magnetic arguments justify the different acidic character of the two derivatives, considering a series of substituents R of different electron donor/acceptor character). Finally, for the fluorofullerenes, C60Fn, and the fullerene epoxides, C60On, a systematic analysis of their addition patterns, aimed at identifying the driving force that governs them, has provided data that complement the scarce experimental results available.
Abstract:
In this work, we study the use of distributional semantics and machine learning to improve statistical machine translation. To that end, we propose a machine learning model based on logistic regression to dynamically model the translation probability of phrases. We show that the proposed model is a generalization of the standard translation probabilities used in statistical machine translation, and we use it to incorporate contextual and distributional semantic information through lexical features, word clusters, and word embeddings. In addition, we explore another approach to integrating distributional semantic knowledge into statistical machine translation: using bilingual word embeddings to model the similarity between phrase translations. Our experiments show the usefulness of the proposed models, obtaining promising results over a strong baseline system. At the same time, our work makes important contributions regarding bilingual mappings of word embeddings and phrase similarity measures based on word embeddings, which have a value of their own beyond machine translation in the field of distributional semantics.
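A hedged sketch of the second approach described above (bilingual word embeddings used to score phrase-translation similarity); the averaging-plus-cosine scheme below is one simple instantiation, not necessarily the thesis' exact model, and src_vecs/tgt_vecs are assumed to map both languages into one shared space.

```python
# Illustrative phrase-translation similarity from bilingual word embeddings:
# average the word vectors of each phrase, then take the cosine similarity.
import numpy as np

def phrase_vec(words, vecs, dim=300):
    vs = [vecs[w] for w in words if w in vecs]
    return np.mean(vs, axis=0) if vs else np.zeros(dim)

def translation_similarity(src_phrase, tgt_phrase, src_vecs, tgt_vecs):
    s = phrase_vec(src_phrase.split(), src_vecs)  # source-side vector
    t = phrase_vec(tgt_phrase.split(), tgt_vecs)  # target-side vector
    denom = np.linalg.norm(s) * np.linalg.norm(t)
    return float(s @ t / denom) if denom else 0.0
```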
Abstract:
Background: The computational grammatical complexity (CGC) hypothesis claims that children with G(rammatical)-specific language impairment (SLI) have a domain-specific deficit in the computational system affecting syntactic dependencies involving 'movement'. One type of such syntactic dependency is the filler-gap dependency. In contrast, the Generalized Slowing Hypothesis claims that SLI children have a domain-general deficit affecting processing speed and capacity. Aims: To test these contrasting accounts of SLI, we investigate the processing of syntactic (filler-gap) dependencies in wh-questions. Methods & Procedures: Fourteen G-SLI children aged 10;2-17;2, 14 age-matched controls, and 17 vocabulary-matched controls were studied using the cross-modal picture-priming paradigm. Outcomes & Results: The G-SLI children's processing speed was significantly slower than that of the age controls, but not of the younger vocabulary controls. The G-SLI children and vocabulary controls did not differ in memory span. However, the typically developing and G-SLI children showed qualitatively different processing patterns. The age and vocabulary controls showed priming at the gap, indicating that they process wh-questions through syntactic filler-gap dependencies. In contrast, the G-SLI children showed priming only at the verb. Conclusions: The findings indicate that G-SLI children fail to reliably establish a syntactic filler-gap dependency and instead interpret wh-questions via lexical thematic information. These data challenge the Generalized Slowing Hypothesis account, but support the CGC hypothesis, according to which G-SLI children have a particular deficit in the computational system affecting syntactic dependencies involving 'movement'. As effective remediation often depends on aetiological insight, the discovery of the nature of the syntactic deficit, alongside a possible compensatory use of semantics to facilitate sentence processing, can be used to direct therapy. However, the therapeutic strategy to be used, and whether similar strengths and weaknesses within the language system are found in other SLI subgroups, are empirical issues that warrant further research.
Abstract:
Iconicity is the non-arbitrary relation between properties of a phonological form and semantic content (e.g. “moo”, “splash”). It is a common feature of both spoken and signed languages, and recent evidence shows that iconic forms confer an advantage during word learning. We explored whether iconic forms conferred a processing advantage for 13 individuals with aphasia following left-hemisphere stroke. Iconic and control words were compared in four different tasks: repetition, reading aloud, auditory lexical decision and visual lexical decision. An advantage for iconic words was seen for some individuals in all tasks, with consistent group effects emerging in reading aloud and auditory lexical decision. Both these tasks rely on mapping between semantics and phonology. We conclude that iconicity aids spoken word processing for individuals with aphasia. This advantage may be due to a stronger connection between semantic information and phonological forms.
Abstract:
The subgradient optimization method is a simple and flexible iterative algorithm for linear programming. It is much simpler than Newton's method and can be applied to a wider variety of problems. It also converges when the objective function is non-differentiable. Since an efficient algorithm should not only produce a good solution but also take little computing time, a simple algorithm that yields high-quality solutions is always preferable. In this study, a series of step-size parameters in the subgradient update is studied. The performance is compared for a general piecewise function and a specific p-median problem. We examine how the quality of the solution changes under five forms of the step-size parameter.
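A minimal sketch of the subgradient update the abstract refers to, x_{k+1} = x_k - a_k * g_k, with a few of the common step-size rules such a comparison would cover; the rules and the toy objective below are illustrative, not the paper's exact five forms.

```python
# Subgradient method with interchangeable step-size rules (illustrative).
import numpy as np

STEP_RULES = {
    "constant":    lambda k: 0.01,              # fixed step
    "diminishing": lambda k: 1.0 / (k + 1),     # not summable, tends to 0
    "sqrt_decay":  lambda k: 1.0 / np.sqrt(k + 1),
}

def subgradient_descent(x0, subgrad, rule="diminishing", iters=1000):
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        x = x - STEP_RULES[rule](k) * subgrad(x)  # x_{k+1} = x_k - a_k * g_k
    return x

# toy non-differentiable objective f(x) = |x1| + |x2|; sign(x) is a subgradient
print(subgradient_descent([3.0, -2.0], np.sign))
```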
Abstract:
This paper analyzes some forms of linguistic manipulation in Japanese newspapers when reporting on North Korea and its nuclear tests. The focus lies on lexical ambiguity in headlines and journalists' voices in the body of the articles, which result in manipulation of the minds of the readers. The study is based on a corpus of nine articles from two of Japan's largest newspapers, Yomiuri Online and Asahi Shimbun Digital. The linguistic phenomena that contribute to creating manipulation are divided into Short-Term Memory impact and Long-Term Memory impact, and examples are discussed under each of these categories. The main results of the study are that headlines in Japanese newspapers do not make use of an ambiguous, double-grounded structure. However, the articles are filled with explicit and implied attitudes as well as attributed material from people of high social status, which suggests that manipulation of the long-term memory is a tool used in Japanese media.
Abstract:
One of the first questions to consider when designing a new roll forming line is the number of forming steps required to produce a profile. The number depends on material properties, the cross-section geometry, and tolerance requirements, but the tool designer also wants to minimize the number of forming steps in order to reduce the investment costs for the customer. There are several computer-aided engineering systems on the market that can assist the tool design process. These include more or less simple formulas to predict deformation during forming as well as the number of forming steps. In recent years it has also become possible to use finite element analysis for the design of roll forming processes. The objective of the work presented in this thesis was to answer the following question: how should the roll forming process be designed for complex geometries and/or high strength steels? The work approach included literature studies as well as experimental and modelling work. The experimental part gave direct insight into the process and was also used to develop and validate models of the process. Starting with simple geometries and standard steels, the work progressed to more complex profiles of variable depth and width, made of high strength steels. The results obtained are published in seven papers appended to this thesis. In the first study (see paper 1), a finite element model for investigating the roll forming of a U-profile was built. It was used to investigate the effect on the longitudinal peak membrane strain and the deformation length when the yield strength increases, see papers 2 and 3. The simulations showed that the peak strain decreases whereas the deformation length increases when the yield strength increases. The studies described in papers 4 and 5 measured roll load, roll torque, springback, and strain history during the U-profile forming process. The measurement results were used to validate the finite element model from paper 1. The results presented in paper 6 show that the formability of stainless steel (e.g. AISI 301), which in the cold-rolled condition has a large martensite fraction, can be substantially increased by heating the bending zone. The heated area then becomes austenitic and ductile before the roll forming. Thanks to the phenomenon of strain-induced martensite formation, the steel regains its martensite content and strength during the subsequent plastic straining. Finally, a new tooling concept for profiles with variable cross-sections is presented in paper 7. The overall conclusions of the present work are that it is now possible to successfully develop profiles of complex geometries (3D roll forming) in high strength steels, and that finite element simulation can be a useful tool in the design of the roll forming process.
Abstract:
In this paper, we propose a new method for solving large-scale p-median problem instances based on real data. We compare different approaches in terms of runtime, memory footprint, and quality of the solutions obtained. In order to test the different methods on real data, we introduce a new benchmark for the p-median problem based on real Swedish data. Because of the size of the problem addressed, up to 1938 candidate nodes, a number of algorithms, both exact and heuristic, are considered. We also propose an improved hybrid version of a genetic algorithm called impGA. Experiments show that impGA performs as well as other methods on the standard set of medium-size problems taken from Beasley's benchmark, and produces comparatively good results in terms of quality, runtime, and memory footprint on our specific benchmark based on real Swedish data.
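For concreteness, the objective all of the compared algorithms optimize is sketched below: choose p candidate nodes (medians) that minimize the total distance from every demand point to its nearest chosen median; the tiny distance matrix is made up for illustration.

```python
# p-median objective: total distance of demand points to the nearest of the
# selected medians. dist is a (demand x candidate) distance matrix.
import numpy as np

def pmedian_cost(dist, medians):
    return dist[:, sorted(medians)].min(axis=1).sum()

# tiny illustrative instance with 4 demand points and 3 candidate nodes
dist = np.array([[0, 4, 9],
                 [4, 0, 5],
                 [9, 5, 0],
                 [3, 6, 2]])
print(pmedian_cost(dist, {0, 2}))   # p = 2 medians: 0 + 4 + 0 + 2 = 6
```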
Abstract:
Schistosomiasis is still an endemic disease in many regions, with 250 million people infected with Schistosoma and about 500,000 deaths per year. Praziquantel (PZQ) is the drug of choice for schistosomiasis treatment; however, it is classified as Class II in the Biopharmaceutics Classification System, as its low solubility hinders its performance in biological systems. The use of cyclodextrins is a useful tool to increase the solubility and bioavailability of drugs. The aim of this work was to prepare an inclusion compound of PZQ and methyl-beta-cyclodextrin (MeCD), perform its physico-chemical characterization, and explore its in vitro cytotoxicity. SEM showed a change in the morphological characteristics of PZQ:MeCD crystals, and IR data supported this finding, with changes after interaction with MeCD including effects on the C-H of the aromatic ring, observed at 758 cm⁻¹. Differential scanning calorimetry measurements revealed that complexation occurred in a 1:1 molar ratio, as evidenced by the lack of a PZQ transition temperature after inclusion into the MeCD cavity. In solution, the PZQ UV spectrum profile in the presence of MeCD was comparable to the PZQ spectrum in a hydrophobic solvent. Phase solubility diagrams showed a 5.5-fold increase in PZQ solubility and were indicative of a type A_L isotherm, which was used to determine an association constant (K_a) of 140.8 M⁻¹. No cytotoxicity of the PZQ:MeCD inclusion compound was observed in tests using 3T3 cells. The results suggest that the association of PZQ with MeCD could be a good alternative for the treatment of schistosomiasis.
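For reference, the association constant reported above is conventionally derived from a linear (A_L-type) phase-solubility diagram via the Higuchi-Connors relation, where S_0 is the intrinsic solubility of the drug and "slope" is the slope of the diagram:

```latex
% Higuchi-Connors relation for a 1:1 inclusion complex from an A_L-type
% phase-solubility diagram (S_0 = intrinsic solubility of PZQ):
K_a = \frac{\text{slope}}{S_0\,(1 - \text{slope})}
```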