882 results for Knowledge based system
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements describe how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
This thesis first considers the calibration and signal processing requirements of a neuromagnetometer for the measurement of human visual function. Gradiometer calibration using straight wire grids is examined and optimal grid configurations determined, given realistic constructional tolerances. Simulations show that for a gradiometer balance of 1:10⁴ and a wire spacing error of 0.25 mm the achievable calibration accuracy is 0.3% for gain, 0.3 mm for position and 0.6° for orientation. Practical results with a 19-channel second-order gradiometer-based system exceed this performance. The real-time application of adaptive reference noise cancellation filtering to running-average evoked response data is examined. In the steady state, the filter can be assumed to be driven by a non-stationary step input arising at epoch boundaries. Based on empirical measures of this driving step, an optimal progression for the filter time constant is proposed which improves upon fixed time constant filter performance. The incorporation of the time-derivatives of the reference channels was found to improve the performance of the adaptive filtering algorithm by 15-20% for unaveraged data, falling to 5% with averaging. The thesis concludes with a neuromagnetic investigation of evoked cortical responses to chromatic and luminance grating stimuli. The global magnetic field power of evoked responses to the onset of sinusoidal gratings was shown to have distinct chromatic and luminance sensitive components. Analysis of the results, using a single equivalent current dipole model, shows that these components arise from activity within two distinct cortical locations. Co-registration of the resulting current source localisations with MRI shows a chromatically responsive area lying along the midline within the calcarine fissure, possibly extending onto the lingual and cuneal gyri. It is postulated that this area is the human homologue of the primate cortical area V4.
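As a rough illustration of the adaptive reference noise cancellation idea described above, the sketch below implements a plain LMS noise canceller whose reference inputs are the reference channels plus their first time-derivatives (the augmentation the abstract reports improving performance by 15-20%). It is not the thesis's running-average filter or its time-constant progression; the step size, sampling rate and synthetic data are assumptions for illustration only.

```python
# Minimal LMS reference noise cancellation sketch (illustrative assumptions throughout).
import numpy as np

def lms_reference_cancel(signal, references, mu=1e-3, use_derivatives=True):
    """Subtract the LMS estimate of reference-correlated noise from `signal`.

    signal     : (n_samples,) signal-channel time series
    references : (n_samples, n_ref) reference-channel time series
    """
    refs = references
    if use_derivatives:
        # Append first time-derivatives of the reference channels.
        refs = np.hstack([refs, np.gradient(references, axis=0)])
    n_samples, n_taps = refs.shape
    w = np.zeros(n_taps)            # adaptive weights
    cleaned = np.empty(n_samples)
    for t in range(n_samples):
        x = refs[t]
        noise_est = w @ x           # current estimate of correlated noise
        e = signal[t] - noise_est   # error = cleaned output
        w += 2.0 * mu * e * x       # LMS weight update
        cleaned[t] = e
    return cleaned

# Toy usage: a weak 10 Hz "evoked" component buried in noise that is linearly
# related to two reference channels.
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1e-3)
references = rng.standard_normal((t.size, 2))
noise = references @ np.array([0.8, -0.5])
signal = 0.1 * np.sin(2 * np.pi * 10 * t) + noise
cleaned = lms_reference_cancel(signal, references)
```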
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially-developed LPA Prolog programs. By redefining the metric counts for Prolog it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
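To make the idea of "re-counting" classic size metrics for a Prolog program concrete, here is a minimal sketch under heavily simplified assumptions: it counts non-comment lines of code and clauses per predicate from raw source text. It is not the thesis's actual redefined Halstead or cyclomatic counts, and the clause-head heuristic is deliberately crude.

```python
# Crude Prolog size metrics sketch (illustrative, not the thesis's metric definitions).
import re
from collections import Counter

def prolog_size_metrics(source: str):
    lines = [ln for ln in source.splitlines() if ln.strip()]
    # Drop single-line comments (% ...); block comments are ignored for brevity.
    code_lines = [ln for ln in lines if not ln.lstrip().startswith('%')]
    # Very rough clause-head detection: a lowercase functor at the start of a line,
    # e.g. "foo(X, Y) :-" or the fact "foo(a).".
    head_pattern = re.compile(r'^([a-z]\w*)\s*\(')
    clauses = Counter()
    for ln in code_lines:
        m = head_pattern.match(ln.strip())
        if m:
            clauses[m.group(1)] += 1
    return {
        'loc': len(code_lines),
        'predicates': len(clauses),
        'clauses_per_predicate': dict(clauses),
    }

example = """
% naive list membership
member(X, [X|_]).
member(X, [_|T]) :- member(X, T).
"""
print(prolog_size_metrics(example))
```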
Abstract:
Lean is usually associated with the ‘operations’ of a manufacturing enterprise; however, there is a growing awareness that these principles may be transferred readily to other functions and sectors. The application to knowledge-based activities such as engineering design is of particular relevance to UK plc. Hence, the purpose of this study has been to establish the state-of-the-art, in terms of the adoption of Lean in new product development, by carrying out a systematic review of the literature. The authors' findings confirm the view that Lean can be applied beneficially away from the factory; that an understanding and definition of value is key to success; that a set-based (or Toyota methodology) approach to design is favoured together with the strong leadership of a chief engineer; and that the successful implementation requires organization-wide changes to systems, practices, and behaviour. On this basis it is felt that this review paper provides a useful platform for further research in this topic.
Abstract:
With the rebirth of coherent detection, various algorithms have come forth to alleviate phase noise, one of the main impairments for coherent receivers. These algorithms provide stable compensation, but they add to the complexity of the DSP. With this key issue in mind, Fabry-Perot filter based self-coherent optical OFDM, which does not require phase noise compensation and thus reduces DSP complexity at low OSNR, was analyzed. However, the performance of such a receiver is limited by ASE noise at the carrier wavelength, especially since an optical amplifier is typically employed with the filter to ensure sufficient carrier power. Subsequently, the use of an injection-locked laser (ILL) to retrieve the frequency and phase information from the extracted carrier without the use of an amplifier was recently proposed. In the ILL-based system, an optical carrier is sent along with the OFDM signal in the transmitter. At the receiver, the carrier is extracted from the OFDM signal using a Fabry-Perot tunable filter, and an ILL is used to significantly amplify the carrier and reduce its intensity and phase noise. In contrast to CO-OFDM, such a system supports low-cost broad-linewidth lasers and benefits from lower DSP complexity, as no carrier frequency estimation and correction, nor phase noise compensation, is required.
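The following toy simulation illustrates why a self-coherent receiver of the kind described above needs no phase noise compensation: the co-transmitted carrier and the OFDM signal carry the same laser phase noise, so mixing with the extracted carrier cancels it, whereas an independent local oscillator does not. The linewidth, FFT size and symbol count are arbitrary assumed values, and the extracted carrier is idealised (no ASE, perfect filtering).

```python
# Toy self-coherent vs independent-LO OFDM phase-noise comparison (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n_fft, n_symbols, fs = 64, 200, 10e9
linewidth = 1e6                       # assumed "broad-linewidth" laser, 1 MHz

# QPSK-loaded OFDM baseband signal.
bits = rng.integers(0, 4, size=(n_symbols, n_fft))
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
tx = np.fft.ifft(qpsk, axis=1).ravel()

# Wiener laser phase noise, common to the signal and the co-propagated carrier.
phi = np.cumsum(rng.normal(0, np.sqrt(2 * np.pi * linewidth / fs), tx.size))
rx_signal = tx * np.exp(1j * phi)
rx_carrier = np.exp(1j * phi)         # idealised carrier after filtering / ILL

# Self-coherent detection: mixing with the extracted carrier cancels the phase noise.
self_coherent = rx_signal * np.conj(rx_carrier)
# Intradyne detection with an independent LO: its own phase noise remains.
phi_lo = np.cumsum(rng.normal(0, np.sqrt(2 * np.pi * linewidth / fs), tx.size))
independent_lo = rx_signal * np.exp(-1j * phi_lo)

def evm(rx):
    eq = np.fft.fft(rx.reshape(n_symbols, n_fft), axis=1)
    return np.sqrt(np.mean(np.abs(eq - qpsk) ** 2) / np.mean(np.abs(qpsk) ** 2))

print(f"EVM self-coherent:  {evm(self_coherent):.3f}")
print(f"EVM independent LO: {evm(independent_lo):.3f}")
```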
Abstract:
Objectives: The creation of more high-growth firms continues to be a key component of enterprise policy throughout the countries of the OECD. In the UK the developing enterprise policy framework highlights the importance of supporting businesses with growth potential. The difficulty, of course, lies in the ability of those delivering business support policies to accurately identify those businesses, especially at start-up, which will benefit from interventions and experience enhanced growth performance. This paper has a core objective of presenting new data on the number of high-growth firms in the UK and providing an assessment of their economic significance.
Approach: This paper uses a specially created longitudinal firm-level database based on the Inter-Departmental Business Register (IDBR) held by the Office of National Statistics (ONS) for all private sector businesses in the UK for the period 1997-2008 to investigate the share of high-growth firms (including a sub-set of start-ups more commonly referred to as gazelles) in successive cohorts of start-ups. We apply OECD definitions of high growth and gazelles to this database and are able to quantify for the first time their number (disaggregated by sector, region, size) and importance (employment and sales).
Prior Work: What is lacking at the core of this policy focus, however, is any comprehensive statistical analysis of the scale and nature of high-growth firms in cohorts of new and established businesses. The evidence base in response to the question “Why do high-growth firms matter?” is surprisingly weak. Important work in this area has been initiated by Bartelsman et al. (2003), Hoffman and Jünge (2006) and Henrekson and Johansson (2009), but to date work in the UK has been limited (BERR, 2008b).
Results: We report that there were ~11,500 high-growth firms in the UK in both 2005 and 2008. The share of high-growth start-ups in the UK in 2005 (6.3%) was, contrary to the widely held perception in policy circles, higher than in the United States (5.2%). Of particular interest in the analysis are the growth trajectories (pattern of growth) of these firms as well as the extent to which they are restricted to technology-based or knowledge-based sectors.
Implications and Value: Using hitherto unused population data for the first time, we have answered a fundamental research and policy question on the number and scale of high-growth firms in the UK. We draw the conclusion that this 'rare' event does not readily lend itself to policy intervention, on the grounds that the significant effort needed to identify such businesses ex ante would appear unjustified even if it were possible.
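For readers unfamiliar with the OECD definitions the authors apply, the sketch below shows how the commonly used OECD-Eurostat thresholds (average annualised employment growth above 20% over three years, at least 10 employees at the start of the period, with "gazelles" being such firms up to five years old) might be operationalised on a firm-level panel. The thresholds are background knowledge rather than figures stated in the abstract, and the column names are invented, not the IDBR schema.

```python
# Hypothetical high-growth / gazelle classification on a toy firm-level panel.
import pandas as pd

def classify_high_growth(panel: pd.DataFrame, start_year: int) -> pd.DataFrame:
    """panel: one row per firm per year with columns
    ['firm_id', 'year', 'employment', 'birth_year'] (assumed, illustrative)."""
    start = panel[panel.year == start_year].set_index('firm_id')
    end = panel[panel.year == start_year + 3].set_index('firm_id')
    both = start.join(end, lsuffix='_start', rsuffix='_end', how='inner')
    growth = (both.employment_end / both.employment_start) ** (1 / 3) - 1
    both['high_growth'] = (growth > 0.20) & (both.employment_start >= 10)
    both['gazelle'] = both.high_growth & (start_year - both.birth_year_start <= 5)
    return both[['high_growth', 'gazelle']]

# Toy example: one young firm tripling employment over three years, one static firm.
panel = pd.DataFrame({
    'firm_id': [1, 1, 2, 2],
    'year': [2005, 2008, 2005, 2008],
    'employment': [12, 36, 50, 52],
    'birth_year': [2002, 2002, 1990, 1990],
})
print(classify_high_growth(panel, 2005))
```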
Abstract:
Purpose: The paper aims to explore the nature and purpose of higher education (HE) in the twenty-first century, focussing on how it can help fashion a green knowledge-based economy by developing approaches to learning and teaching that are social, networked and ecologically sensitive.
Design/methodology/approach: The paper presents a discursive analysis of the skills and knowledge requirements of an emerging green knowledge-based economy, using a range of policy-focussed and academic research literature.
Findings: The business opportunities that are emerging as a more sustainable world is developed require the knowledge and skills that can capture and move them forward, but in a complex and uncertain world learning needs to be non-linear, creative and emergent.
Practical implications: Sustainable learning and the attributes graduates will need to exhibit are prefigured in the activities and learning characterising the work and play facilitated by new media technologies.
Social implications: Greater emphasis is required on higher learning, understood as the capability to learn, adapt and direct sustainable change; this requires interprofessional co-operation that must utilise the potential of new media technologies to enhance social learning and collective intelligence.
Originality/value: The practical relationship between low-carbon economic development, social sustainability and HE learning is based on both normative criteria and actual and emerging projections in economic, technological and skills needs.
Abstract:
We present a video-based system which interactively captures the geometry of a 3D object in the form of a point cloud, then recognizes and registers known objects in this point cloud in a matter of seconds (fig. 1). In order to achieve interactive speed, we exploit both efficient inference algorithms and parallel computation, often on a GPU. The system can be broken down into two distinct phases: geometry capture, and object inference. We now discuss these in further detail. © 2011 IEEE.
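The registration step mentioned above can be illustrated with a minimal rigid point-cloud alignment: a plain ICP loop using nearest-neighbour correspondences and an SVD (Kabsch) transform estimate. This is a generic textbook sketch, not the paper's own inference algorithms or GPU implementation, and the toy data are invented.

```python
# Minimal ICP-style registration sketch (generic technique, not the paper's method).
import numpy as np
from scipy.spatial import cKDTree

def icp(model, scene, iterations=30):
    """Align `model` (N, 3) to `scene` (M, 3); returns rotation R and translation t."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(scene)
    src = model.copy()
    for _ in range(iterations):
        # 1. Nearest-neighbour correspondences.
        _, idx = tree.query(src)
        dst = scene[idx]
        # 2. Best rigid transform for these correspondences (Kabsch / SVD).
        src_c, dst_c = src.mean(0), dst.mean(0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R_step = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t_step = dst_c - R_step @ src_c
        # 3. Apply the incremental transform and accumulate it.
        src = src @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t

# Toy check: recover a known rotation and translation of a random cloud.
rng = np.random.default_rng(2)
model = rng.standard_normal((500, 3))
angle = np.deg2rad(10)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
scene = model @ R_true.T + np.array([0.1, -0.2, 0.05])
R_est, t_est = icp(model, scene)
```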
Abstract:
Over the past decade or so a number of changes have been observed in traditional Japanese employment relations (ERs) systems, such as an increase in non-regular workers, a move towards performance-based systems and a continuous decline in union membership. There is a large body of Anglo-Saxon and Japanese literature providing evidence that national factors such as national institutions, national culture, and the business and economic environment have significantly influenced what were hitherto three ‘sacred’ aspects of Japanese ERs systems (ERSs). However, no research has been undertaken until now at the firm level regarding the extent to which changes in national factors influence ERSs across firms. This article develops a model to examine the impact of national factors on ER systems, and analyses that impact on ER systems at the firm level. Based on information collected from two different groups of companies, namely Mitsubishi Chemical Group (MCG) and the Federation of Shinkin Bank (FSB), the research finds that, except for a few similarities, the impact of national factors on Japanese ER systems differs at the firm level. This indicates that the impact of national factors varies in the implementation of employment relations factors. In the case of MCG, national culture has little to do with the seniority-based system. The study also reveals that national culture factors have little influence on the enterprise-based system in the case of FSB. This analysis is useful for domestic and international organizations as it helps to better understand the role of national factors in determining Japanese ERSs.
Abstract:
Recent work on ultra-long Raman fiber lasers has shown that it is possible to create quasi-lossless transmission conditions in fiber spans long enough to be considered for high speed optical communications. This paper reviews how quasi-lossless transmission conditions are reached and presents experimental results of 40 Gb/s transmission in a quasi-lossless system. The performance is compared with a conventional EDFA-based system.
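A rough numerical sketch of what "quasi-lossless" means in practice: with lumped (EDFA) amplification the signal decays over the whole span before being boosted, whereas distributed gain that balances the fibre loss keeps the power excursion small. The model below uses an idealised first-order bidirectional gain profile as a crude stand-in for the ultra-long Raman laser scheme; span length, loss coefficients and the gain shape are all assumed toy values.

```python
# Toy comparison of signal power excursion: lumped EDFA vs quasi-lossless span.
import numpy as np

L = 80.0                        # assumed span length, km
alpha_s = 0.2 / 4.343           # signal loss, 0.2 dB/km converted to Np/km
alpha_p = 0.25 / 4.343          # pump loss, 0.25 dB/km converted to Np/km
z = np.linspace(0.0, L, 2001)
dz = z[1] - z[0]

# Lumped amplification: the signal decays over the span; the EDFA sits at the end.
p_edfa_db = -0.2 * z

# Distributed gain shaped by forward + backward pump profiles, normalised so
# that the integrated gain equals the integrated span loss (the quasi-lossless condition).
shape = np.exp(-alpha_p * z) + np.exp(-alpha_p * (L - z))
gain = shape * alpha_s / shape.mean()
p_raman_db = 4.343 * np.cumsum(gain - alpha_s) * dz

print(f"Power excursion, EDFA span:           {p_edfa_db.max() - p_edfa_db.min():.1f} dB")
print(f"Power excursion, quasi-lossless span: {p_raman_db.max() - p_raman_db.min():.1f} dB")
```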
Developing a probabilistic graphical structure from a model of mental-health clinical risk expertise
Abstract:
This paper explores the process of developing a principled approach for translating a model of mental-health risk expertise into a probabilistic graphical structure. The Galatean Risk Screening Tool [1] is a psychological model for mental health risk assessment based on fuzzy sets. This paper details how the knowledge encapsulated in the psychological model was used to develop the structure of the probability graph by exploiting the semantics of the clinical expertise. These semantics are formalised by a detailed specification for an XML structure used to represent the expertise. The component parts were then mapped to equivalent probabilistic graphical structures such as Bayesian Belief Nets and Markov Random Fields to produce a composite chain graph that provides a probabilistic classification of risk expertise to complement the expert clinical judgements. © Springer-Verlag 2010.
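The general idea of mapping XML-encoded expertise onto a graphical structure can be sketched as follows: read a hierarchical XML description of risk concepts and emit the directed parent-to-child edges of a graph, which a probabilistic toolkit could then parameterise. The element and attribute names below are invented for illustration and are not the actual GRiST XML specification or its semantics.

```python
# Hypothetical XML-to-graph-structure sketch (invented schema, not the GRiST specification).
import xml.etree.ElementTree as ET
import networkx as nx

toy_xml = """
<concept name="suicide_risk">
  <concept name="hopelessness">
    <datum name="expressed_hopelessness"/>
  </concept>
  <concept name="previous_attempts">
    <datum name="number_of_attempts"/>
    <datum name="recency_of_last_attempt"/>
  </concept>
</concept>
"""

def build_structure(xml_text: str) -> nx.DiGraph:
    """Map each parent concept to its child concepts/data as directed edges."""
    graph = nx.DiGraph()
    def walk(element):
        for child in element:
            graph.add_edge(element.get('name'), child.get('name'))
            walk(child)
    walk(ET.fromstring(xml_text))
    return graph

structure = build_structure(toy_xml)
print(sorted(structure.edges()))   # parent -> child links that a chain graph could use
```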
An agent approach to improving radio frequency identification enabled Returnable Transport Equipment
Abstract:
Returnable transport equipment (RTE) such as pallets form an integral part of the supply chain, and poor management leads to costly losses. Companies often address this matter by outsourcing the management of RTE to logistics service providers (LSPs). LSPs are faced with the task of providing logistical expertise to reduce RTE related waste, whilst differentiating their own services to remain competitive. In the current challenging economic climate, the role of the LSP in delivering innovative ways to achieve competitive advantage has never been so important. It is reported that radio frequency identification (RFID) application to RTE enables LSPs such as DHL to gain competitive advantage and offer clients improvements such as loss reduction, process efficiency improvement and effective security. However, the increased visibility and functionality of RFID enabled RTE requires further investigation in regards to decision-making. The distributed nature of the RTE network favours a decentralised decision-making format. Agents are an effective way to represent objects from the bottom up, capturing the behaviour and enabling localised decision-making. Therefore, an agent based system is proposed to represent the RTE network and utilise the visibility and data gathered from RFID tags. Two types of agents are developed in order to represent the trucks and RTE, which have bespoke rules and algorithms in order to facilitate negotiations. The aim is to create schedules which integrate RTE pick-ups as the trucks go back to the depot. The findings assert that:
- agent based modelling provides an autonomous tool, which is effective in modelling RFID enabled RTE in a decentralised manner, utilising the real-time data facility;
- the RFID enabled RTE model developed enables autonomous agent interaction, which leads to a feasible schedule integrating both forward and reverse flows for each RTE batch;
- the RTE agent scheduling algorithm developed promotes the utilisation of RTE by including an automatic return flow for each batch of RTE, whilst considering the fleet costs and utilisation rates;
- the research conducted contributes an agent based platform, which LSPs can use in order to assess the most appropriate strategies to implement for RTE network improvement for each of their clients.
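To give a flavour of the truck/RTE agent negotiation described above, here is a minimal sketch in which RTE batch agents announce pick-up requests and truck agents bid the detour cost of collecting a batch on their way back to the depot, with the cheapest bid winning. The classes, one-dimensional route model and cost rule are invented simplifications, not the thesis's bespoke rules and algorithms.

```python
# Toy contract-net style truck/RTE negotiation (invented classes and cost rules).
from dataclasses import dataclass

@dataclass
class RTEBatch:
    batch_id: str
    location: float           # 1-D position along the return route, for simplicity

@dataclass
class Truck:
    truck_id: str
    route_end: float          # position of the last delivery before heading to the depot
    capacity: int             # remaining RTE batches this truck can carry

    def bid(self, batch: RTEBatch) -> float:
        """Detour cost of collecting the batch on the way back to the depot (at 0.0)."""
        if self.capacity <= 0:
            return float('inf')
        detour = max(0.0, batch.location - self.route_end)
        return 2 * detour      # out-and-back detour beyond the planned route

def negotiate(trucks, batches):
    """Each batch is awarded to the cheapest bidder that still has capacity."""
    schedule = {}
    for batch in batches:
        winner = min(trucks, key=lambda t: t.bid(batch))
        if winner.bid(batch) < float('inf'):
            winner.capacity -= 1
            schedule[batch.batch_id] = winner.truck_id
    return schedule

trucks = [Truck('T1', route_end=30.0, capacity=2), Truck('T2', route_end=10.0, capacity=1)]
batches = [RTEBatch('B1', 12.0), RTEBatch('B2', 28.0), RTEBatch('B3', 35.0)]
print(negotiate(trucks, batches))
```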
Abstract:
Radio Frequency Identification (RFID) has been identified as a crucial technology for the modern 21st century knowledge-based economy. Some businesses have realised benefits of RFID adoption through improvements in operational efficiency, additional cost savings, and opportunities for higher revenues. RFID research in warehousing operations has been less prominent than in other application domains. To investigate how RFID technology has had an impact in warehousing, a comprehensive analysis of research findings available from articles through leading scientific article databases has been conducted. Articles from years 1995 to 2010 have been reviewed and analysed with respect to warehouse operations, RFID application domains, benefits achieved and obstacles encountered. Four discussion topics are presented covering RFID in warehousing focusing on its applications, perceived benefits, obstacles to its adoption and future trends. This is aimed at elucidating the current state of RFID in the warehouse and providing insights for researchers to establish new research agendas and for practitioners to consider and assess the adoption of RFID in warehousing functions. © 2013 Elsevier B.V.
Abstract:
The Electronic Patient Record (EPR) is being developed by many hospitals in the UK and across the globe. We class an EPR system as a type of Knowledge Management System (KMS), in that it is a technological tool developed to support the process of knowledge management (KM). Healthcare organisations aim to use these systems to provide a vehicle for more informed and improved clinical decision making, thereby delivering reduced errors and risks, enhanced quality and, consequently, enhanced patient safety. Finding an effective way for a healthcare organisation to practically implement these systems is essential. In this study we use the concept of the business process approach to KM as a theoretical lens to analyse and explore how a large NHS teaching hospital developed, executed and practically implemented an EPR system. This theory advocates the importance of taking into account all organisational activities - the business processes - in considering any KM initiatives. Approaching KM through business processes allows for a more holistic view of the requirements across a process: emphasis is placed on how particular activities are performed, how they are structured and what knowledge is demanded, and not just supplied, across each process. This falls in line with the increased emphasis in healthcare on patient-centred approaches to care delivery. We have found in previous research that hospitals are happy with the delivery of patient care being referred to as their 'business'. A qualitative study was conducted over a two-and-a-half-year period, with data collected from semi-structured interviews with eight members of the strategic management team, 12 clinical users and 20 patients, in addition to non-participant observation of meetings and documentary data. We believe that the inclusion of patients within the study may well be the first time this has been done in examining the implementation of a KMS. The theoretical propositions strategy was used as the overarching approach for data analysis: initial theoretical research themes and propositions were used to help shape and organise the case study analysis. This paper will present preliminary findings about the hospital's business strategy and its links to the KMS strategy and process.
Abstract:
A formal model of natural language processing in knowledge-based information systems is considered, and the components realizing the functions of the proposed formal model are described.