830 results for Information needs – representation
Abstract:
Neural networks can be regarded as statistical models and analysed in a Bayesian framework. Generalisation is measured by performance on independent test data drawn from the same distribution as the training data, and can be quantified by the posterior average of the information divergence between the true and the model distributions. Averaging over the Bayesian posterior guarantees internal coherence; using information divergence guarantees invariance with respect to representation. The theory generalises the least-mean-squares theory for linear Gaussian models to general problems of statistical estimation. The main results are: (1) the ideal optimal estimate is always given by the average over the posterior; (2) the optimal estimate within a computational model is given by the projection of the ideal estimate onto the model. This incidentally shows that some currently popular methods for dealing with hyperpriors are in general unnecessary and misleading. The extension of information divergence to positive normalisable measures reveals a remarkable relation between the dual affine geometry of statistical manifolds and the geometry of the dual pair of Banach spaces Ld and Ldd. It therefore offers conceptual simplification to information geometry. The general conclusion on the issue of evaluating neural network learning rules and other statistical inference methods is that such evaluations are only meaningful under three assumptions: the prior P(p), describing the environment of all the problems; the divergence Dd, specifying the requirement of the task; and the model Q, specifying the available computing resources.
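The first main result, that the ideal estimate is the posterior average, can be checked numerically in a toy discrete setting. The posterior weights and candidate distributions below are invented for illustration; the point is that the posterior mixture minimises the expected divergence:

```python
import numpy as np

def kl_divergence(p, q):
    """Information divergence D(p || q) between discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical posterior over three candidate model distributions.
posterior = np.array([0.5, 0.3, 0.2])
candidates = np.array([
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.5, 0.3, 0.2],
])

# The ideal estimate is the posterior average (mixture) of the candidates.
ideal = posterior @ candidates

def expected_divergence(q):
    """Posterior-expected divergence E[D(p || q)] of a single estimate q."""
    return sum(w * kl_divergence(p, q) for w, p in zip(posterior, candidates))

# The posterior mean scores no worse than any single candidate or a
# uniform guess, as the theory predicts.
assert expected_divergence(ideal) <= expected_divergence([1/3, 1/3, 1/3])
```

Minimising E[D(p || q)] over q reduces to maximising the cross-entropy term, whose optimum is exactly the posterior mean of p, which is why the check above holds for any posterior.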
Abstract:
We develop an approach to sparse representation for Gaussian Process (GP) models in order to overcome the limitations that large data sets impose on GPs. The method combines a Bayesian online algorithm with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the model. Experimental results on toy examples and on large real-world datasets indicate the efficiency of the approach.
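A minimal sketch of the underlying idea, prediction supported by only a small subset of the data, might look as follows. The kernel, the evenly spaced subsample, and all parameter values are placeholder choices, not the paper's online selection criterion:

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_gp_predict(X, y, X_star, n_basis=15, noise=0.1):
    """Predict with a GP whose posterior is supported on a small 'basis
    vector' subset of the training data (chosen naively here; the paper
    constructs the subset sequentially, online)."""
    idx = np.linspace(0, len(X) - 1, n_basis).astype(int)
    Xb = X[idx]
    K_bb = rbf(Xb, Xb) + noise * np.eye(n_basis)   # regularised Gram matrix
    alpha = np.linalg.solve(K_bb, y[idx])          # weights on basis points
    return rbf(X_star, Xb) @ alpha

X = np.linspace(0, 2 * np.pi, 200)[:, None]
y = np.sin(X[:, 0])
pred = sparse_gp_predict(X, y, X)
```

Even this naive subsample tracks the smooth toy function well; the contribution of the paper is choosing the subset online so that it stays small yet informative on large datasets.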
Abstract:
Graphic depiction is an established method for academics to present concepts from theories of innovation. These expressions have been adopted by policy-makers, the media and businesses. However, there has been little research on the extent of their usage or their effectiveness outside academia. In addition, innovation theorists have neglected this area of study, despite the communication of information about innovation being acknowledged as a major determinant of success for corporate enterprise. The thesis explores some major themes in the theories of innovation and compares how graphics are used to represent them. It examines the contribution of visual sociology and graphic theory to an investigation of a sample of graphics; the methodological focus is a modified content analysis. The following expressions are explored: check lists, matrices, maps and mapping in the management of innovation; models, flow charts, organisational charts and networks in the innovation process; and curves and cycles in the representation of performance and progress. The main conclusion is that academia is leading the way in usage as well as novelty. The graphic message is switching from prescription to description. The computerisation of graphics has created a major role for the information designer. It is recommended that use of the graphic representation of innovation be increased in all domains, though it is conceded that its content and execution need to improve, too. Education of graphic 'producers', 'intermediaries' and 'consumers' will play a part in this, as will greater exploration of diversity, novelty and convention. Work has begun to tackle this, and suggestions for future research are made.
Abstract:
To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, companies need to equip themselves with intelligent tools that enable managers to make better tactical decisions. However, implementing an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, for example managers and engineers, in making the right decisions for continual process improvement. Copyright © 2006 Inderscience Enterprises Ltd.
Abstract:
Information systems have developed to the stage that plenty of data is available in most organisations, yet there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision-support information and transaction-processing data should be a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current object models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor, COSTed, which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
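The flavour of a statistical summary table built over a collection of objects can be sketched in a few lines. The object attributes and measure below are invented for illustration; COST itself sits on the Object Prototyper and CORD metadata rather than plain dictionaries:

```python
from collections import defaultdict

# A collection of objects with categorical dimensions and one measure.
sales = [
    {"region": "North", "year": 2023, "amount": 120.0},
    {"region": "North", "year": 2024, "amount": 150.0},
    {"region": "South", "year": 2023, "amount": 90.0},
    {"region": "South", "year": 2023, "amount": 30.0},
]

def summary_table(objects, dims, measure):
    """Collapse a collection of objects into totals per dimension tuple."""
    table = defaultdict(float)
    for obj in objects:
        table[tuple(obj[d] for d in dims)] += obj[measure]
    return dict(table)

table = summary_table(sales, ("region", "year"), "amount")
assert table[("South", 2023)] == 120.0  # two South/2023 objects aggregated
```

Keeping the dimensions and measure as explicit metadata, rather than hard-coding them, is the essence of driving such a table from a model like CORD.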
Abstract:
A multi-chromosome GA (Multi-GA) was developed, based upon concepts from the natural world, allowing improved flexibility in a number of areas, including representation, genetic operators, their parameter rates, and real-world multi-dimensional applications. A series of experiments compared the performance of the Multi-GA to a traditional GA on a number of recognised and increasingly complex test optimisation surfaces, with promising results. Further experiments demonstrated the Multi-GA's flexibility through the use of non-binary chromosome representations and its applicability to dynamic parameterisation. A number of alternative and new methods of dynamic parameterisation were investigated, in addition to a new non-binary 'Quotient crossover' mechanism. Finally, the Multi-GA was applied to two real-world problems, demonstrating its ability to handle mixed-type chromosomes within an individual, the limited use of a chromosome-level fitness function, the introduction of new genetic operators for structural self-adaptation, and its viability as a serious real-world analysis tool. The first problem involved optimum placement of computers within a building, allowing the Multi-GA to use multiple chromosomes with different type representations and different operators in a single individual. The second problem, commonly associated with Geographical Information Systems (GIS), required a spatial analysis to locate the optimum number and distribution of retail sites over two different population grids. In applying the Multi-GA, two new genetic operators (addition and deletion) were developed and explored, resulting in the definition of a mechanism for self-modification of genetic material within the Multi-GA structure and a study of this behaviour.
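A minimal sketch of an individual carrying chromosomes of different types, with a toy fitness function and mutation loop, is shown below. The representation and objective are invented for illustration; the thesis's operator set (including the addition and deletion operators) is considerably richer:

```python
import random

random.seed(0)

# A Multi-GA individual carries several chromosomes of different types:
# here one binary string and one real-valued vector (illustrative only).
def new_individual():
    return {
        "binary": [random.randint(0, 1) for _ in range(8)],
        "real": [random.uniform(-1, 1) for _ in range(4)],
    }

def fitness(ind):
    # Toy objective: maximise ones in the binary chromosome while
    # driving the real-valued genes towards zero.
    return sum(ind["binary"]) - sum(g * g for g in ind["real"])

def mutate(ind, rate=0.1):
    # Each chromosome type gets its own operator: bit-flip vs Gaussian.
    child = {k: list(v) for k, v in ind.items()}
    for i in range(len(child["binary"])):
        if random.random() < rate:
            child["binary"][i] = 1 - child["binary"][i]
    for i in range(len(child["real"])):
        if random.random() < rate:
            child["real"][i] += random.gauss(0, 0.1)
    return child

# Simple (1+1)-style loop: the child replaces the parent only if no worse.
start = new_individual()
best = start
for _ in range(500):
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child
```

The point of the sketch is the per-chromosome operators: each chromosome type evolves under its own mutation rule inside a single individual, which is the flexibility the Multi-GA exploits on mixed-type problems.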
Abstract:
In Information Filtering (IF), a user may be interested in several topics in parallel. However, IF systems have been built on representational models derived from Information Retrieval and Text Categorization, which assume independence between terms. The linearity of these models results in user profiles that can represent only one topic of interest. We present a methodology that takes term dependencies into account to construct a single profile representation for multiple topics, in the form of a hierarchical term network. We also introduce a series of non-linear functions for evaluating documents against the profile. Initial experiments produced positive results.
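The idea of a non-linear, dependency-aware evaluation can be sketched with a tiny term hierarchy. The terms, weights, and the suppression rule below are invented stand-ins, not the paper's actual profile construction or scoring functions:

```python
# A toy hierarchical profile: a general term with more specific sub-terms.
profile = {
    "term": "python",
    "weight": 1.0,
    "children": [
        {"term": "django", "weight": 0.8, "children": []},
        {"term": "numpy", "weight": 0.6, "children": []},
    ],
}

def score(node, doc_terms):
    """Non-linear scoring: a child term contributes only when its parent
    also occurs, so term dependencies change the document's score."""
    if node["term"] not in doc_terms:
        return 0.0  # a missing parent suppresses its whole subtree
    return node["weight"] + sum(score(c, doc_terms) for c in node["children"])

doc1 = {"python", "numpy"}   # parent present: the dependency counts
doc2 = {"numpy", "django"}   # no parent: sub-terms contribute nothing

assert score(profile, doc1) == 1.6
assert score(profile, doc2) == 0.0
```

A linear model would give doc2 a positive score from its matching terms alone; the hierarchical rule is what lets one profile keep multiple topics separate.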
Abstract:
The research is concerned with the terminological problems that computer users experience when they try to formulate their knowledge needs and attempt to access information contained in computer manuals or online help systems while building up their knowledge. This is the recognised but unresolved problem of communication between the specialist and the layman. The initial hypothesis was that computer users, through their knowledge of language, have some prior knowledge of the subdomain of computing they are trying to come to terms with, and that language can be either a facilitating mechanism or an obstacle in the development of that knowledge. Related to this is the supposition that users have a conceptual apparatus based on both theoretical knowledge and experience of the world, and of several domains of special reference related to the environment in which they operate. The theoretical argument was developed by exploring the relationship between knowledge and language, and by considering the efficacy of terms as agents of special subject knowledge representation. Having charted in a systematic way the territory of knowledge sources and types, we were able to establish that there are many aspects of knowledge which cannot be represented by terms. This finding is important, as it leads to the realisation that significant elements of knowledge are disregarded in retrieval systems because they are normally expressed by language elements which do not enjoy the status of terms. Furthermore, we introduced the notion of 'linguistic ease of retrieval' as a challenge to more conventional thinking which focuses on retrieval results.
Abstract:
This thesis is concerned with the design and implementation of a community health information system which fulfils some of the local needs of fourteen nursing and para-medical professions in a district health authority, whilst satisfying the statutory requirements of the NHS Korner steering group for those professions. A national survey of community health computer applications, documented in the form of an applications register, shows the need for such a system. A series of general requirements for an information systems design methodology are identified, together with specific requirements for this problem situation. A number of existing methodologies are reviewed, but none of these was appropriate for this application, so some existing approaches, tools and techniques are used to define a more suitable methodology. It is unreasonable to rely on one single general methodology for all types of application development; there is a need for pragmatism, adaptation and flexibility. In this research, participation in the development stages by those who would eventually use the system was thought desirable, and was achieved by forming a representative design group. Results suggest a highly favourable response from users to this participation, which contributed to the overall success of the system implemented. A prototype was developed for the chiropody and school nursing staff groups of Darlington health authority, and evaluations show that a significant number of the problems and objectives of those groups have been successfully addressed; the value of community health information has been increased; and information has been successfully fed back to staff and better utilised.
Abstract:
Concern has been expressed in the professional literature, borne out by professional experience and observation, that the supply and demand relationship between the 13 English and Welsh Library and Information Studies (LIS) Schools (as providers of 'First Professional' staff) and the Higher Education Library and Information Services (HE LIS) sector of England and Wales (as one group of employers of such staff) is unsatisfactory and needs attention. An appropriate methodology to investigate this problem was devised. A basic content analysis of the Schools' curricular and recruitment material intended for public consumption was undertaken to establish an overview of the LIS initial professional education system in England and Wales, and to identify and analyse any covert messages imparted to readers. This was followed by a mix of main questionnaires and semi-structured interviews with appropriate populations. The investigation revealed some serious areas of dissatisfaction among the HE LIS chiefs with the role and function of the Schools. Considerable divergence of views emerged on the state of the working relationships between the two sectors, on the Schools' success in meeting the needs of the HE LIS sector, and on CPD provision. There were, however, areas of substantial and consistent agreement between the two sectors. The main implication of the findings is that the areas of divergence are worrying and need addressing by both sides. Possible ways forward include recommendations on improving the image of the profession purveyed by the Schools; forming closer and more effective inter-sectoral relationships; and recognising fully the importance of the 'practicum' while increasing and sustaining the network of 'practicum' providers.
Abstract:
The purpose of this paper is to present the outcomes of an 'employability management needs analysis' intended to provide more insight into current employability management activities and their possible benefits for Information and Communication Technology (ICT) professionals working in Small- and Medium-sized Enterprises (SMEs) throughout Europe. A series of interviews (N=107) was conducted with managers in SMEs in seven European countries: Germany, Greece, Italy, the Netherlands, Norway, Poland, and the UK. A semi-structured interview protocol was used to cover three issues: employability (13 items), ageing (8 items), and future developments and requirements (13 items). Analysis of the final interview transcriptions was conducted at a national level using an elaborate common coding scheme. Although an interest in employability emerged, actual policy and action lagged behind. The recession in the ICT sector at the time of the investigation and the developmental stage of the sector in each participating country appeared to be connected. Ageing was not seen as a major issue in the ICT sector because managers considered ICT to be a relatively young sector. There appeared to be a serious lack of investment in the development of expertise of ICT professionals. Generalization of the results to large organizations in the ICT sector should be made with caution. The interview protocol developed is of value for further research and complements survey research undertaken within the employability field of study. It can be concluded that proactive HRM (Human Resource Management) policies and strategies are essential, even in times of economic downturn. Employability management activities are especially important in the light of current career issues. The study advances knowledge regarding HRM practices adopted by SMEs in the ICT sector, especially as there is a gap in knowledge about career development issues in that particular sector.
Abstract:
This research investigates the contribution that Geographic Information Systems (GIS) can make to the land suitability process used to determine the effects of a climate change scenario. It is intended to redress the severe under-representation of developing countries in the literature examining the impacts of climatic change upon crop productivity. The methodology adopts some of the Intergovernmental Panel on Climate Change (IPCC) estimates for regional climate variations, based upon General Circulation Model (GCM) predictions, and applies them to a baseline climate for Bangladesh. Utilising the United Nations Food and Agriculture Organization's Agro-ecological Zones land suitability methodology and crop yield model, the effects of the scenario upon the productivity of 14 crops are determined. A Geographic Information System (IDRISI) is adopted to facilitate the methodology, in conjunction with a specially designed spreadsheet used to determine the yield and suitability rating for each crop. A simple optimisation routine using the GIS is incorporated to indicate the 'maximum theoretical' yield available to the country, should the most calorifically significant crops be cultivated on each land unit, both before and after the climate change scenario. This routine provides an estimate of the theoretical population-supporting capacity of the country, both now and in the future, to assist with planning strategies and research. The research evaluates the utility of this alternative GIS-based methodology for the land evaluation process and determines the relative changes in crop yields that may result from changes in temperature, photosynthesis and flooding hazard frequency. In summary, the combination of a GIS and a spreadsheet was successful; the yield prediction model indicates that the climate change scenario will have a deleterious effect upon the yields of the study crops.
Any yield reductions will have severe implications for agricultural practices. The optimisation routine suggests that the 'theoretical maximum' population-supporting capacity is well in excess of current and future population figures. If this agricultural potential could be realised, however, it might provide some amelioration of the effects of climate change.
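The core step of such an optimisation routine, assigning each land unit its calorifically best crop, can be sketched over toy rasters. All yields and caloric densities below are invented for illustration, not outputs of the Agro-ecological Zones model:

```python
import numpy as np

# Toy per-hectare yield rasters (t/ha) for three crops on a 2x2 grid,
# with rough caloric densities (kcal/kg); all numbers are illustrative.
yields = {
    "rice":   np.array([[4.0, 2.0], [3.0, 1.0]]),
    "wheat":  np.array([[2.5, 3.0], [1.0, 2.0]]),
    "potato": np.array([[8.0, 5.0], [2.0, 9.0]]),
}
kcal_per_kg = {"rice": 3600, "wheat": 3300, "potato": 770}

# Convert each raster to kcal/ha, then take the per-cell maximum: the
# 'theoretical maximum' output if every land unit grew its best crop.
crops = list(yields)
kcal = np.stack([yields[c] * 1000 * kcal_per_kg[c] for c in crops])
best_crop = np.take(crops, kcal.argmax(axis=0))
max_kcal = kcal.max(axis=0)
```

Summing `max_kcal` over all land units and dividing by per-capita caloric requirements yields the kind of population-supporting-capacity estimate the thesis reports, before and after applying the climate change scenario to the yield rasters.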
Abstract:
Time after time… and aspect and mood. Over the last twenty-five years, the study of the acquisition of time, aspect and, to a lesser extent, mood has enjoyed increasing popularity and a constant widening of its scope. In such a teeming field, what can be the contribution of this book? We believe that it is unique in several respects. First, this volume encompasses studies from different theoretical frameworks: functionalism vs generativism, or function-based vs form-based approaches. It also brings together various sub-fields (first and second language acquisition, child and adult acquisition, bilingualism) that tend to evolve in parallel rather than learn from each other. A further originality is that it covers a wide range of typologically different languages, and features less-studied languages such as Korean and Bulgarian. Finally, the book gathers well-established scholars, young researchers, and research students in a rich inter-generational exchange that ensures not only the survival but also the renewal and refreshment of the discipline. The book at a glance: the first part of the volume is devoted to the study of child language in monolingual, impaired and bilingual acquisition, while the second part focuses on adult learners. In this section, we provide an overview of each chapter. The first study, by Aviya Hacohen, explores the acquisition of compositional telicity in Hebrew L1. Her psycholinguistic approach contributes valuable data to refine theoretical accounts. Through an innovative methodology, she gathers information from adults and children on the influence of definiteness, number, and the mass vs count distinction on the constitution of a telic interpretation of the verb phrase. She notices that the notion of definiteness is mastered by children as young as 10, while the mass/count distinction does not appear before 10;7. However, this does not entail an adult-like use of telicity.
She therefore concludes that, beyond definiteness and noun type, pragmatics may play an important role in the derivation of Hebrew compositional telicity. For the second chapter we move from a Semitic language to a Slavic one. Milena Kuehnast focuses on the acquisition of negative imperatives in Bulgarian, a form that presents the specificity of being grammatical only with the imperfective form of the verb. The study examines how 40 Bulgarian children, distributed in two age groups (15 between 2;11 and 3;11, and 25 between 4;00 and 5;00), develop with respect to the acquisition of imperfective viewpoints and the use of imperfective morphology. It shows an evolution in the recourse to expression of force in the use of negative imperatives, as well as the influence of morphological complexity on the successful production of forms. With Yi-An Lin's study, we turn to another type of informant and another framework. He studies the production of children suffering from Specific Language Impairment (SLI), a developmental language disorder whose causes exclude cognitive impairment, psycho-emotional disturbance, and motor-articulatory disorders. Using the Leonard corpus in CLAN, Lin aims to test two competing accounts of SLI (the Agreement and Tense Omission Model [ATOM] and his own Phonetic Form Deficit Model [PFDM]) that conflict on the role attributed to spellout in the impairment. Spellout is the point at which the Computational System for Human Language (CHL) passes the most recently derived part of the derivation over to the interface components, Phonetic Form (PF) and Logical Form (LF). ATOM claims that SLI sufferers have a deficit in their syntactic representation, while PFDM suggests that the problem occurs only at the spellout level. After studying the corpus from the point of view of tense and agreement marking, case marking, argument movement and auxiliary inversion, Lin finds further support for his model.
Olga Gupol, Susan Rohstein and Sharon Armon-Lotem's chapter offers a welcome bridge between child language acquisition and multilingualism. Their study explores the influence of intensive exposure to L2 Hebrew on the development of L1 Russian tense and aspect morphology through an elicited narrative. Their informants are 40 Russian-Hebrew sequential bilingual children distributed in two age groups, 4;0–4;11 and 7;0–8;0. They come to the conclusion that bilingual children anchor their narratives in the perfective, like monolinguals. However, while aware of grammatical aspect, bilinguals lack the full form-function mapping and tend to overgeneralize the imperfective on the principles of simplicity (imperfectives are the least morphologically marked forms), universality (the imperfective covers more functions) and interference. Rafael Salaberry opens the second section, on foreign language learners. In his contribution, he reflects on the difficulty L2 learners of Spanish encounter when it comes to distinguishing between iterativity (conveyed with the use of the preterite) and habituality (expressed through the imperfect). He examines in turn the theoretical views that see, on the one hand, habituality as part of grammatical knowledge and iterativity as pragmatic knowledge, and, on the other hand, both habituality and iterativity as grammatical knowledge. He comes to the conclusion that the use of the preterite as a default past-tense marker may explain the impoverished system of aspectual distinctions, not only at beginner but also at advanced levels, which may indicate that the system is differentially represented among L1 and L2 speakers. Acquiring the vast array of functions conveyed by a form is therefore no mean feat, as confirmed by the next study. Based on prototype theory, Kathleen Bardovi-Harlig's chapter focuses on the development of the progressive in L2 English. It opens with an overview of the functions of the progressive in English.
Then, a review of acquisition research on the progressive in English and other languages is provided. The bulk of the chapter reports on a longitudinal study of 16 learners of L2 English and shows how their use of the progressive expands from the prototypical uses of process and continuousness to the less prototypical uses of repetition and future. The study concludes that the progressive spreads in interlanguage in accordance with prototype accounts. However, it suggests additional stages, not predicted by the Aspect Hypothesis, in the development from activities and accomplishments, at least for the meaning of repeatedness. A similar theoretical framework is adopted in the following chapter, but it deals with a lesser-studied language. Hyun-Jin Kim revisits the claims of the Aspect Hypothesis in relation to the acquisition of L2 Korean by two L1 English learners. Inspired by studies on L2 Japanese, she focuses on the emergence and spread of the past/perfective marker -ess- and the progressive -ko iss- in the interlanguage of her informants throughout their third and fourth semesters of study. The data, collected through six sessions of conversational interviews and picture description tasks, seem to support the Aspect Hypothesis: learners show a strong association between past tense and accomplishments/achievements at the start and a gradual extension to other types; a limited use of the past/perfective marker with states; and an affinity of the progressive with activities/accomplishments and, later, achievements. In addition, -ko iss- moves from progressive to resultative in the specific category of Korean verbs meaning wear/carry. While the previous contributions focus on function, Evgeniya Sergeeva and Jean-Pierre Chevrot's is interested in form. The authors explore the acquisition of verbal morphology in L2 French by 30 instructed native speakers of Russian distributed across low and high proficiency levels.
They use an elicitation task for verbs with different models of stem alternation and study how token frequency and base forms influence stem selection. The analysis shows that frequency affects correct production, especially among learners with high proficiency. As for substitution errors, it appears that forms with a simple structure are systematically more frequent than the target forms they replace. When a complex form serves as a substitute, it is more frequent only when it is replacing another complex form. As regards the use of base forms, the 3rd person singular of the present, and to some extent the infinitive, play this role in the corpus. The authors therefore conclude that the processing of surface forms can be influenced positively or negatively by the frequency of the target forms and of other competing stems, and by the proximity of the target stem to a base form. Finally, Martin Howard's contribution takes up the challenge of focusing on the poorer relation of the TAM system. On the basis of L2 French data obtained through sociolinguistic interviews, he studies the expression of futurity, the conditional and the subjunctive in three groups of university learners with classroom teaching only (two or three years of university teaching) or with a mixture of classroom teaching and naturalistic exposure (two years at university plus one year abroad). An analysis of relative frequencies leads him to suggest a continuum of use going from the futurate present to the conditional in past hypothetical conditional clauses in si, which needs to be confirmed by further studies. Acknowledgements: The present volume was inspired by the conference Acquisition of Tense – Aspect – Mood in First and Second Language, held on 9 and 10 February 2008 at Aston University (Birmingham, UK), where over 40 delegates from four continents and over a dozen countries met for lively and enjoyable discussions.
This collection of papers was double peer-reviewed by an international scientific committee made up of Kathleen Bardovi-Harlig (Indiana University), Christine Bozier (Lund Universitet), Alex Housen (Vrije Universiteit Brussel), Martin Howard (University College Cork), Florence Myles (Newcastle University), Urszula Paprocka (Catholic University of Lublin), †Clive Perdue (Université Paris 8), Michel Pierrard (Vrije Universiteit Brussel), Rafael Salaberry (University of Texas at Austin), Suzanne Schlyter (Lund Universitet), Richard Towell (Salford University), and Daniel Véronique (Université d'Aix-en-Provence). We are much indebted to the scientific committee for their insightful input at each step of the project. We are also grateful to the Association for French Language Studies for its financial support through its workshop grant, and to the Aston Modern Languages Research Foundation for funding the proofreading of the manuscript.
Abstract:
This paper summarizes the scientific work presented at the 32nd European Conference on Information Retrieval. It demonstrates that information retrieval (IR) as a research area continues to thrive, with progress being made in three complementary sub-fields: IR theory and formal methods, together with indexing and query representation; Web IR as a primary application area; and research into evaluation methods and metrics. It is the combination of these areas that gives IR its solid scientific foundations. The paper also illustrates that significant progress has been made in other areas of IR. The keynote speakers addressed three such fields: social search engines using personalization and recommendation technologies, the renewed interest in applying natural language processing to IR, and multimedia IR as another fast-growing area.