13 results for keyword

in Aston University Research Archive


Relevance:

10.00%

Publisher:

Abstract:

The use of ontologies as representations of knowledge is widespread, but their construction, until recently, has been entirely manual. We argue in this paper for the use of text corpora and automated natural language processing methods for the construction of ontologies. We delineate the challenges and present criteria for the selection of appropriate methods. We distinguish three major steps in ontology building: associating terms, constructing hierarchies, and labelling relations. A number of methods are presented for these purposes, but we conclude that data sparsity remains a major challenge. We argue for the use of resources external to the domain-specific corpus.
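The first of these steps, associating terms, is commonly grounded in co-occurrence statistics. A minimal sketch of one such approach, scoring sentence-level co-occurrences by pointwise mutual information (the toy corpus and function name are illustrative, not taken from the paper):

```python
from collections import Counter
from itertools import combinations
from math import log2

def pmi_associations(sentences, min_count=1):
    """Score term pairs by pointwise mutual information over
    sentence-level co-occurrence, a common first step when
    associating terms for ontology building."""
    word_counts = Counter()
    pair_counts = Counter()
    for tokens in sentences:
        uniq = sorted(set(tokens))
        word_counts.update(uniq)
        pair_counts.update(combinations(uniq, 2))
    n = len(sentences)
    scores = {}
    for (w1, w2), c in pair_counts.items():
        if c < min_count:
            continue
        p_xy = c / n                      # joint probability
        p_x = word_counts[w1] / n         # marginals
        p_y = word_counts[w2] / n
        scores[(w1, w2)] = log2(p_xy / (p_x * p_y))
    return scores

corpus = [
    ["ontology", "knowledge", "representation"],
    ["ontology", "hierarchy", "construction"],
    ["corpus", "knowledge", "extraction"],
]
scores = pmi_associations(corpus)
```

Pairs that only ever occur together (e.g. "construction"/"hierarchy" above) score higher than pairs of frequent terms that co-occur once, which is exactly the data-sparsity sensitivity the paper flags.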

Relevance:

10.00%

Publisher:

Abstract:

Since the original Data Envelopment Analysis (DEA) study by Charnes et al. [Measuring the efficiency of decision-making units. European Journal of Operational Research 1978;2(6):429–44], there has been rapid and continuous growth in the field. As a result, a considerable amount of published research has appeared, with a significant portion focused on DEA applications of efficiency and productivity in both public and private sector activities. While several bibliographic collections have been reported, a comprehensive listing and analysis of DEA research covering its first 30 years of history is not available. This paper thus presents an extensive, if not nearly complete, listing of DEA research covering theoretical developments as well as “real-world” applications from inception to the year 2007. A listing of the most utilized/relevant journals, a keyword analysis, and selected statistics are presented.
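The CCR model at the root of this literature reduces each DMU's efficiency score to a linear programme. A minimal sketch of the input-oriented multiplier form (the toy single-input, single-output data are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (multiplier form):
    maximise u.y0 subject to v.x0 = 1 and u.yj - v.xj <= 0 for all j.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    _, s = Y.shape
    # decision variables: [v (m input weights), u (s output weights)];
    # linprog minimises, so negate the objective u.y0
    c = np.concatenate([np.zeros(m), -Y[j0]])
    # -v.xj + u.yj <= 0 for every DMU j
    A_ub = np.hstack([-X, Y])
    b_ub = np.zeros(n)
    # normalisation v.x0 = 1
    A_eq = np.concatenate([X[j0], np.zeros(s)]).reshape(1, -1)
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m + s))
    return -res.fun  # efficiency score in (0, 1]

X = np.array([[2.0], [4.0], [8.0]])  # one input per DMU
Y = np.array([[2.0], [3.0], [4.0]])  # one output per DMU
scores = [ccr_efficiency(X, Y, j) for j in range(3)]
```

With one input and one output this collapses to each DMU's output/input ratio relative to the best ratio, so the scores here are 1.0, 0.75, and 0.5.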

Relevance:

10.00%

Publisher:

Abstract:

The present bibliography includes most of the references published in the field of Data Envelopment Analysis (DEA) up to the year 2007. Some publications in 2007 are also listed, but not included in the statistics. To the best of the authors’ knowledge, this listing appears to be the most complete source of references on DEA and its applications in measuring the efficiency and productivity of decision making units (DMUs). The authors hope that this new updated bibliography will assist researchers and other scholars as they develop new frontiers in DEA. The most utilized/relevant journals, a keyword analysis, and selected statistics are also presented.

Relevance:

10.00%

Publisher:

Abstract:

Keyword identification in one of two simultaneous sentences is improved when the sentences differ in F0, particularly when they are almost continuously voiced. Sentences of this kind were recorded, monotonised using PSOLA, and re-synthesised to give a range of harmonic ΔF0s (0, 1, 3, and 10 semitones). They were additionally re-synthesised by LPC with the LPC residual frequency shifted by 25% of F0, to give excitation with inharmonic but regularly spaced components. Perceptual identification of frequency-shifted sentences showed a similar large improvement with nominal ΔF0 as seen for harmonic sentences, although overall performance was about 10% poorer. We compared performance with that of two autocorrelation-based computational models comprising four stages: (i) peripheral frequency selectivity and half-wave rectification; (ii) within-channel periodicity extraction; (iii) identification of the two major peaks in the summary autocorrelation function (SACF); (iv) a template-based approach to speech recognition using dynamic time warping. One model sampled the correlogram at the target-F0 period and performed spectral matching; the other deselected channels dominated by the interferer and performed matching on the short-lag portion of the residual SACF. Both models reproduced the monotonic increase observed in human performance with increasing ΔF0 for the harmonic stimuli, but not for the frequency-shifted stimuli. A revised version of the spectral-matching model, which groups patterns of periodicity that lie on a curve in the frequency-delay plane, showed a closer match to the perceptual data for frequency-shifted sentences. The results extend the range of phenomena originally attributed to harmonic processing to grouping by common spectral pattern.
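Stages (i)-(iii) of such models rest on autocorrelation-based periodicity extraction. A much-reduced sketch (a single channel with half-wave rectification only; the actual models use a peripheral filterbank and per-channel correlograms), recovering the F0 of a synthetic harmonic complex:

```python
import numpy as np

fs = 16000
f0 = 100.0
t = np.arange(int(0.1 * fs)) / fs
# harmonic complex, standing in for one voiced frame of a sentence
x = sum(np.sin(2 * np.pi * f0 * k * t) for k in range(1, 6))

# crude periphery: half-wave rectification, then autocorrelation
# as the periodicity measure (stages (i)-(ii), single channel)
r = np.maximum(x, 0.0)
acf = np.correlate(r, r, mode="full")[len(r) - 1:]

# stage (iii), reduced to one peak: search lags in a plausible
# pitch range (60-300 Hz) for the autocorrelation maximum
lo, hi = int(fs / 300), int(fs / 60)
lag = lo + int(np.argmax(acf[lo:hi]))
f0_hat = fs / lag
```

The dominant peak falls at the 160-sample period of the 100 Hz complex, so `f0_hat` recovers the F0; in the full models this per-channel step is summed across channels to form the SACF.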

Relevance:

10.00%

Publisher:

Abstract:

The thesis is divided into four chapters: introduction, experimental, results and discussion of the free ligands, and results and discussion of the complexes. The first chapter, the introductory chapter, is a general introduction to the study of solid state reactions. The second chapter is devoted to the materials and experimental methods that have been used for carrying out the experiments. The third chapter is concerned with the characterisation of the free ligands (picolinic acid, nicotinic acid, and isonicotinic acid) by elemental analysis, IR spectra, X-ray diffraction, and mass spectra. Additionally, the thermal behaviour of the free ligands in air has been studied by means of thermogravimetry (TG), derivative thermogravimetry (DTG), and differential scanning calorimetry (DSC) measurements. The thermal decomposition behaviour of the three free ligands was not identical. Finally, a computer program has been used for kinetic evaluation of non-isothermal differential scanning calorimetry data according to composite and single heating rate methods, in comparison with the methods of Ozawa and Kissinger. The most probable reaction mechanism for the free ligands was the Avrami-Erofeev equation (A), which describes a solid-state nucleation-growth mechanism. The activation parameters of the decomposition reaction for the free ligands were calculated, and the results of the different methods of data analysis were compared and discussed. The fourth chapter, the final chapter, deals with the preparation of cobalt, nickel, and copper complexes with mono-pyridine carboxylic acids in aqueous solution. The prepared complexes have been characterised by elemental analysis, IR spectra, X-ray diffraction, magnetic moments, and electronic spectra. The stoichiometry of these compounds was ML2·xH2O (where M = metal ion, L = organic ligand, and x = number of water molecules).
The environments of cobalt, nickel, and copper nicotinates and of cobalt and nickel picolinates were octahedral, whereas the environment of copper picolinate [Cu(PA)2] was tetragonal. However, cobalt, nickel, and copper isonicotinates formed polymeric octahedral structures. The morphological changes that occurred throughout the decomposition were followed by SEM observation. The thermal behaviour of the prepared complexes in air was studied by TG, DTG, and DSC measurements. During the degradation of the hydrated complexes, the water molecules of crystallisation were lost in one or two steps, followed by loss of the organic ligands, leaving the metal oxides. Comparison between the DTG temperatures of the first and second dehydration steps suggested that the water of crystallisation was more strongly bonded to the anion in the Ni(II) complexes than in the complexes of Co(II) and Cu(II). The intermediate products of decomposition were not identified. The most probable reaction mechanism for the prepared complexes was also the Avrami-Erofeev equation (A), characteristic of a solid-state nucleation-growth mechanism. The temperature dependence of the direct-current conductivity was determined for cobalt, nickel, and copper isonicotinates, and the activation energy (ΔE) was calculated. The temperature and frequency dependence of the conductivity, the frequency dependence of the dielectric constant, and the dielectric loss for nickel isonicotinate were determined using alternating current. The value of the s parameter and the density of states [N(Ef)] were calculated. Keywords: thermal decomposition, kinetics, electrical conduction, pyridine monocarboxylic acid, complex, transition metal complex.
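The Kissinger method mentioned in the abstract extracts the activation energy from the shift of the DSC peak temperature with heating rate: ln(β/Tp²) plotted against 1/Tp is linear with slope -Ea/R. A minimal sketch on synthetic data (the heating rates, peak temperatures, and constant C are illustrative, not the thesis' measurements):

```python
import numpy as np
from scipy.optimize import brentq

R = 8.314  # gas constant, J mol^-1 K^-1

def kissinger_ea(betas, peak_temps):
    """Kissinger method: ln(beta/Tp^2) versus 1/Tp is linear with
    slope -Ea/R, where beta is the heating rate and Tp the DSC
    peak temperature measured at that rate."""
    x = 1.0 / np.asarray(peak_temps)
    y = np.log(np.asarray(betas) / np.asarray(peak_temps) ** 2)
    slope, _ = np.polyfit(x, y, 1)
    return -slope * R  # activation energy in J/mol

# synthetic peak temperatures consistent with a known Ea of 150 kJ/mol:
# solve ln(beta/Tp^2) = C - Ea/(R*Tp) for Tp at each heating rate
Ea_true, C = 150e3, -5.0
betas = np.array([5.0, 10.0, 20.0, 40.0])  # heating rates, K/min
Tp = [brentq(lambda T, b=b: np.log(b / T**2) - (C - Ea_true / (R * T)),
             300.0, 3000.0) for b in betas]
Ea_est = kissinger_ea(betas, Tp)  # recovers ~150 kJ/mol
```

Because the synthetic peak temperatures satisfy the Kissinger relation exactly, the linear fit recovers the assumed activation energy to numerical precision.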

Relevance:

10.00%

Publisher:

Abstract:

Agent-based technology is playing an increasingly important role in today's economy. Usually a multi-agent system is needed to model an economic system such as a market system, in which heterogeneous trading agents interact with each other autonomously. Two questions often need to be answered regarding such systems: 1) How to design an interacting mechanism that facilitates efficient resource allocation among usually self-interested trading agents? 2) How to design an effective strategy in some specific market mechanism for an agent to maximise its economic returns? For automated market systems, auctions are the most popular mechanism for solving resource allocation problems among their participants. However, auctions come in hundreds of different formats, some better than others in terms not only of allocative efficiency but also of other properties, e.g. whether the auction generates high revenue for the auctioneer and whether it induces stable bidder behaviour. In addition, different strategies result in very different performance under the same auction rules. Against this background, we investigate auction mechanism and strategy designs for agent-based economics. The international Trading Agent Competition (TAC) Ad Auction (AA) competition provides a very useful platform to develop and test agent strategies in the Generalised Second Price auction (GSP). AstonTAC, the runner-up of TAC AA 2009, is a successful advertiser agent designed for GSP-based keyword auctions. In particular, AstonTAC generates adaptive bid prices according to the Market-based Value Per Click and selects the set of keyword queries with the highest expected profit to bid on, maximising its expected profit under its conversion-capacity limit. Through evaluation experiments, we show that AstonTAC performs well and stably not only in the competition but also across a broad range of environments.
The TAC CAT tournament provides an environment for investigating the optimal design of mechanisms for double auction markets. AstonCAT-Plus is the post-tournament version of the specialist developed for CAT 2010. In our experiments, AstonCAT-Plus not only outperforms most specialist agents designed by other institutions but also achieves high allocative efficiencies, transaction success rates, and average trader profits. Moreover, we reveal some insights into the CAT game: 1) successful markets should maintain a stable and high market share of intra-marginal traders; 2) a specialist's performance depends on the distribution of trading strategies. However, typical double auction models assume trading agents have a fixed trading direction of either buy or sell. With this limitation, they cannot directly reflect the fact that traders in financial markets (the most popular application of the double auction) decide their trading directions dynamically. To address this issue, we introduce the Bi-directional Double Auction (BDA) market, which is populated by two-way traders. Experiments are conducted under both dynamic and static settings of the continuous BDA market. We find that the allocative efficiency of a continuous BDA market mainly comes from rational selection of trading directions. Furthermore, we introduce a high-performance Kernel trading strategy for the BDA market, which uses a kernel probability density estimator built on historical transaction data to decide optimal order prices. The Kernel trading strategy outperforms some popular intelligent double auction trading strategies, including ZIP, GD, and RE, in the continuous BDA market, making the highest profit in static games and accumulating the most wealth in dynamic games.
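The kernel idea behind the BDA trading strategy can be illustrated with a density estimate over past transaction prices. A minimal sketch (Gaussian kernels and a mode-picking rule; the thesis' actual pricing rule may differ):

```python
import numpy as np

def kernel_order_price(history, bandwidth=1.0, grid=None):
    """Build a Gaussian-kernel density over historical transaction
    prices and return the density mode as a candidate order price
    (one plausible reading of a kernel-based pricing rule)."""
    history = np.asarray(history, dtype=float)
    if grid is None:
        grid = np.linspace(history.min(), history.max(), 200)
    # sum of Gaussian kernels centred on each past transaction price
    diffs = (grid[:, None] - history[None, :]) / bandwidth
    density = np.exp(-0.5 * diffs ** 2).sum(axis=1)
    return grid[int(np.argmax(density))]

past_trades = [99.0, 100.0, 100.5, 101.0, 100.2, 99.8, 120.0]
price = kernel_order_price(past_trades)
```

The estimator concentrates the order price where trades cluster (near 100 here) and discounts the outlier at 120, which is the robustness a kernel density gives over, say, a simple mean of recent prices.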

Relevance:

10.00%

Publisher:

Abstract:

This paper describes the design and evaluation of AstonTAC, the runner-up in the Ad Auction Game of the 2009 International Trading Agent Competition. In particular, we focus on how AstonTAC generates adaptive bid prices according to the Market-based Value Per Click and how it selects a set of keyword queries to bid on to maximise the expected profit under limited conversion capacity. Through evaluation experiments, we show that AstonTAC performs well and stably not only in the competition but also across a broad range of environments. © 2010 The authors and IOS Press. All rights reserved.
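In the GSP auction used by the Ad Auction game, each slot winner pays the bid of the advertiser ranked immediately below. A minimal sketch of the ranking and payment rule (simplified: no quality scores and a zero reserve price):

```python
def gsp_allocate(bids):
    """Generalised Second Price: rank bidders by bid; the winner of
    slot i pays the bid of the bidder ranked i+1 (simplified form
    without quality scores)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    payments = {}
    for i, (bidder, _) in enumerate(ranked[:-1]):
        payments[bidder] = ranked[i + 1][1]  # next-highest bid
    payments[ranked[-1][0]] = 0.0  # lowest bidder pays the reserve (0 here)
    return ranked, payments

bids = {"A": 3.0, "B": 5.0, "C": 1.0}
ranked, pay = gsp_allocate(bids)
# B takes the top slot and pays A's bid of 3.0; A pays C's bid of 1.0
```

Because the price paid depends on the next bid rather than one's own, bid-price adaptation of the kind AstonTAC performs is about choosing position and volume, not shaving one's own payment directly.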

Relevance:

10.00%

Publisher:

Abstract:

While semantic search technologies have been proven to work well in specific domains, they still have to confront two main challenges to scale up to the Web in its entirety. In this work we address this issue with a novel semantic search system that a) provides the user with the capability to query Semantic Web information using natural language, by means of an ontology-based Question Answering (QA) system [14] and b) complements the specific answers retrieved during the QA process with a ranked list of documents from the Web [3]. Our results show that ontology-based semantic search capabilities can be used to complement and enhance keyword search technologies.

Relevance:

10.00%

Publisher:

Abstract:

The goal of semantic search is to improve on traditional search methods by exploiting the semantic metadata. In this paper, we argue that supporting iterative and exploratory search modes is important to the usability of all search systems. We also identify the types of semantic queries the users need to make, the issues concerning the search environment and the problems that are intrinsic to semantic search in particular. We then review the four modes of user interaction in existing semantic search systems, namely keyword-based, form-based, view-based and natural language-based systems. Future development should focus on multimodal search systems, which exploit the advantages of more than one mode of interaction, and on developing the search systems that can search heterogeneous semantic metadata on the open semantic Web.

Relevance:

10.00%

Publisher:

Abstract:

The debate about services-led competitive strategies continues to grow, with much interest emerging around the differing practices between production and servitised operations. This paper contributes to this discussion by investigating the vertical integration practice (in particular the micro-vertical integration, otherwise known as the supply chain position) of manufacturers who are successful in their adoption of servitization. Although these are preliminary findings from a longer-term research programme, through this technical note we seek to simultaneously contribute to the debate in the research community and offer guidance to practitioners exploring the consequences of servitization. Keywords: servitization, product-service systems, through-life services.

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we explore the idea of social role theory (SRT) and propose a novel regularized topic model which incorporates SRT into the generative process of social media content. We assume that a user can play multiple social roles, and each social role serves to fulfil different duties and is associated with a role-driven distribution over latent topics. In particular, we focus on social roles corresponding to the most common social activities on social networks. Our model is instantiated on microblogs, i.e., Twitter and community question-answering (cQA), i.e., Yahoo! Answers, where social roles on Twitter include "originators" and "propagators", and roles on cQA are "askers" and "answerers". Both explicit and implicit interactions between users are taken into account and modeled as regularization factors. To evaluate the performance of our proposed method, we have conducted extensive experiments on two Twitter datasets and two cQA datasets. Furthermore, we also consider multi-role modeling for scientific papers where an author's research expertise area is considered as a social role. A novel application of detecting users' research interests through topical keyword labeling based on the results of our multi-role model has been presented. The evaluation results have shown the feasibility and effectiveness of our model.

Relevance:

10.00%

Publisher:

Abstract:

The role of source properties in across-formant integration was explored using three-formant (F1+F2+F3) analogues of natural sentences (targets). In experiment 1, F1+F3 were harmonic analogues (H1+H3) generated using a monotonous buzz source and second-order resonators; in experiment 2, F1+F3 were tonal analogues (T1+T3). F2 could take either form (H2 or T2). Target formants were always presented monaurally; the receiving ear was assigned randomly on each trial. In some conditions, only the target was present; in others, a competitor for F2 (F2C) was presented contralaterally. Buzz-excited or tonal competitors were created using the time-reversed frequency and amplitude contours of F2. Listeners must reject F2C to optimize keyword recognition. Whether or not a competitor was present, there was no effect of source mismatch between F1+F3 and F2. The impact of adding F2C was modest when it was tonal but large when it was harmonic, irrespective of whether F2C matched F1+F3. This pattern was maintained when harmonic and tonal counterparts were loudness-matched (experiment 3). Source type and competition, rather than acoustic similarity, governed the phonetic contribution of a formant. Contrary to earlier research using dichotic targets, requiring across-ear integration to optimize intelligibility, H2C was an equally effective informational masker for H2 as for T2.

Relevance:

10.00%

Publisher:

Abstract:

This study explored the effects on speech intelligibility of across-formant differences in fundamental frequency (ΔF0) and F0 contour. Sentence-length speech analogues were presented dichotically (left = F1+F3; right = F2), either alone or, because competition usually reveals grouping cues most clearly, accompanied in the left ear by a competitor for F2 (F2C) that listeners must reject to optimize recognition. F2C was created by inverting the F2 frequency contour. In experiment 1, all left-ear formants shared the same constant F0 and the ΔF0 for F2 was 0 or ±4 semitones. In experiment 2, all left-ear formants shared the natural F0 contour and the contour for F2 was natural, constant, exaggerated, or inverted. Adding F2C lowered keyword scores, presumably because of informational masking. The results for experiment 1 were complicated by effects associated with the direction of the ΔF0 for F2; this problem was avoided in experiment 2 because all four F0 contours had the same geometric mean frequency. When the target formants were presented alone, scores were relatively high and did not depend on the F0 contour of F2. F2C impact was greater when F2 had a different F0 contour from the other formants. This effect was a direct consequence of the associated ΔF0; the F0 contour of F2 per se did not influence competitor impact.
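Inverting a frequency contour, as used here to create F2C and the inverted F0 condition, can be read as mirroring the contour about its geometric mean on a log-frequency axis, which preserves the geometric mean frequency that the experiment equated across conditions. A minimal sketch under that assumption (the abstract does not spell out the exact transformation):

```python
import numpy as np

def invert_contour(freqs):
    """Mirror a frequency contour about its geometric mean on a
    log-frequency axis: rises become falls while the geometric
    mean frequency is left unchanged."""
    logf = np.log(np.asarray(freqs, dtype=float))
    return np.exp(2.0 * logf.mean() - logf)

contour = np.array([100.0, 125.0, 160.0])  # rising contour, Hz
inverted = invert_contour(contour)          # falling contour
```

On the log axis the reflection is linear, so the mean of the log-frequencies (hence the geometric mean frequency) is preserved exactly while the direction of every movement reverses.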