Abstract:
This thesis was concerned with investigating methods of improving the IOP pulse's potential as a measure of clinical utility. There were three principal sections to the work. 1. Optimisation of measurement and analysis of the IOP pulse. A literature review, covering the years 1960–2002 and other relevant scientific publications, provided a knowledge base on the IOP pulse. Initial studies investigated suitable instrumentation and measurement techniques. Fourier transformation was identified as a promising method of analysing the IOP pulse, and this technique was developed. 2. Investigation of ocular and systemic variables that affect IOP pulse measurements. In order to recognise clinically important changes in IOP pulse measurement, studies were performed to identify influencing factors. Fourier analysis was tested against traditional parameters in order to assess its ability to detect differences in the IOP pulse. In addition, it had been speculated that the waveform components of the IOP pulse contained vascular characteristics analogous to those found in arterial pulse waves. Validation studies to test this hypothesis were attempted. 3. The nature of the intraocular pressure pulse in health and disease and its relation to systemic cardiovascular variables. Fourier analysis and traditional parameters were applied to IOP pulse measurements taken on diseased and healthy eyes. Only the derived parameter, pulsatile ocular blood flow (POBF), detected differences in diseased groups. The use of an ocular pressure-volume relationship may have improved the POBF measure's variance in comparison to the measurement of the pulse's amplitude or Fourier components. Finally, the importance of the driving force of pulsatile blood flow, the arterial pressure pulse, is highlighted.
A method of combining the measurements of pulsatile blood flow and pulsatile blood pressure to create a measure of ocular vascular impedance is described along with its advantages for future studies.
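As a purely illustrative sketch (not the thesis's actual measurement pipeline), Fourier decomposition of a sampled pulse waveform can be demonstrated as follows; the sampling rate, waveform and function names are hypothetical.

```python
import numpy as np

def pulse_harmonics(signal, fs, n_harmonics=3):
    """Return (frequency, amplitude) for the first harmonics of a
    periodic pulse. signal: 1-D pressure samples; fs: sample rate in Hz."""
    spectrum = np.fft.rfft(signal - np.mean(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    amps = 2.0 * np.abs(spectrum) / len(signal)
    # Fundamental = largest non-DC peak (assumes a clean periodic pulse).
    f0_idx = int(np.argmax(amps[1:]) + 1)
    return [(freqs[k * f0_idx], amps[k * f0_idx])
            for k in range(1, n_harmonics + 1)]

# Synthetic "pulse": 1.2 Hz fundamental plus a weaker second harmonic.
fs = 200.0
t = np.arange(0, 10, 1 / fs)
sig = 2.0 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 2.4 * t)
for f, a in pulse_harmonics(sig, fs):
    print(f"{f:.1f} Hz: amplitude {a:.2f}")
```

The relative amplitudes of the recovered harmonics are the kind of waveform components the thesis compares against traditional pulse parameters.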
Abstract:
Swarm intelligence is a popular paradigm for algorithm design. Frequently drawing inspiration from natural systems, it assigns simple rules to a set of agents with the aim that, through local interactions, they collectively solve some global problem. Current variants of a popular swarm-based optimization algorithm, particle swarm optimization (PSO), are investigated with a focus on premature convergence. A novel variant, dispersive PSO, is proposed to address this problem and is shown to lead to increased robustness and performance compared to current PSO algorithms. A nature-inspired decentralised multi-agent algorithm is proposed to solve a constrained problem of distributed task allocation. Agents must collect and process the mail batches, without global knowledge of their environment or communication between agents. New rules for specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. These new rules are compared with a market-based approach to agent control. The efficiency (average number of tasks performed), the flexibility (ability to react to changes in the environment), and the sensitivity to load (ability to cope with differing demands) are investigated in both static and dynamic environments. A hybrid algorithm combining both approaches is shown to exhibit improved efficiency and robustness. Evolutionary algorithms are employed, both to optimize parameters and to allow the various rules to evolve and compete. We also observe extinction and speciation. In order to interpret algorithm performance we analyse the causes of efficiency loss, derive theoretical upper bounds for the efficiency, as well as a complete theoretical description of a non-trivial case, and compare these with the experimental results.
Motivated by this work we introduce agent "memory" (the possibility for agents to develop preferences for certain cities) and show that not only does it lead to emergent cooperation between agents, but also to a significant increase in efficiency.
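For readers unfamiliar with PSO, a minimal sketch of the standard global-best variant follows. This is the baseline the thesis builds on, not the proposed dispersive variant; the parameter values (inertia w, acceleration coefficients c1, c2) are conventional choices, and all names are illustrative.

```python
import random

def pso_minimise(f, dim, n_particles=20, iters=200,
                 w=0.72, c1=1.49, c2=1.49, bounds=(-5.0, 5.0)):
    """Minimise f over a box using standard global-best PSO."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + cognitive pull to pbest + social pull to gbest.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, value = pso_minimise(lambda x: sum(xi * xi for xi in x), dim=2)
print(best, value)
```

Premature convergence, the thesis's focus, occurs when all particles collapse onto gbest before the search space is adequately explored; dispersive PSO modifies the update to counteract exactly this collapse.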
Abstract:
Purpose: This paper reviews current literature and contributes a set of findings that capture the current state-of-the-art of the topic of green production. Design/methodology/approach: A literature review to capture, classify and summarize the main body of knowledge on green production and translate this into a form that is readily accessible to researchers and practitioners in the more mainstream operations management community. Findings: The existing knowledge base is somewhat fragmented. This is a relatively unexplored topic within mainstream operations management research and one which could provide rich opportunities for further exploration. Originality/value: This paper sets out to review current literature, from a more conventional production operations perspective, and contributes a set of findings that capture the current state-of-the-art of this topic.
Abstract:
A nature-inspired decentralised multi-agent algorithm is proposed to solve a problem of distributed task allocation in which cities produce and store batches of different mail types. Agents must collect and process the mail batches, without global knowledge of their environment or communication between agents. The problem is constrained so that agents are penalised for switching mail types. When an agent processes a mail batch of a different type to the previous one, it must undergo a change-over, with repeated change-overs rendering the agent inactive. The efficiency (average amount of mail retrieved) and the flexibility (ability of the agents to react to changes in the environment) are investigated both in static and dynamic environments and with respect to sudden changes. New rules for mail selection and specialisation are proposed and are shown to exhibit improved efficiency and flexibility compared to existing ones. We employ an evolutionary algorithm which allows the various rules to evolve and compete. Apart from obtaining optimised parameters for the various rules for any environment, we also observe extinction and speciation.
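The abstract does not give the specialisation rules themselves. As a hedged sketch of the general idea, a classic response-threshold rule from nature-inspired task allocation (not the paper's actual rules) can be written as follows; all names and parameter values are invented for illustration.

```python
import random

class Agent:
    """Threshold-based mail-type specialisation (illustrative only)."""
    def __init__(self, n_types, learn=0.1, forget=0.05):
        self.theta = [0.5] * n_types   # response threshold per mail type
        self.learn, self.forget = learn, forget

    def pick(self, stimulus):
        """Choose a mail type probabilistically: low threshold plus high
        stimulus gives a high response weight s^2 / (s^2 + theta^2)."""
        weights = [s * s / (s * s + t * t + 1e-12)
                   for s, t in zip(stimulus, self.theta)]
        r = random.random() * sum(weights)
        acc = 0.0
        for k, wgt in enumerate(weights):
            acc += wgt
            if r <= acc:
                return k
        return len(weights) - 1

    def reinforce(self, chosen):
        """Specialise: lower the chosen type's threshold, raise the rest."""
        for k in range(len(self.theta)):
            if k == chosen:
                self.theta[k] = max(0.0, self.theta[k] - self.learn)
            else:
                self.theta[k] = min(1.0, self.theta[k] + self.forget)

agent = Agent(n_types=3)
for _ in range(50):
    t = agent.pick([0.9, 0.1, 0.1])   # mail type 0 dominates the demand
    agent.reinforce(t)
print(agent.theta)
```

Specialisation of this kind reduces costly change-overs, but too much of it hurts flexibility when demand shifts, which is exactly the trade-off the paper investigates.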
Abstract:
Purpose: The servitization of manufacturing is a diverse and complex field of research interest. The purpose of this paper is to provide an integrative and organising lens for viewing the various contributions to knowledge production from those research communities addressing servitization. To achieve this, the paper sets out to address two principal questions: where are the knowledge stocks and flows amongst the research communities, and what generic research concerns are being addressed by these communities? Design/methodology/approach: Using an evidence-based approach, the authors have performed a systematic review of the research literature associated with the servitization of manufacturing. This investigation incorporates a descriptive and thematic analysis of 148 academic and scholarly papers from 103 different lead authors in 68 international peer-reviewed journals. Findings: The work proposes support for the existence of distinct researcher communities, namely services marketing, service management, operations management, product-service systems and service science management and engineering, which are contributing to knowledge production in the servitization of manufacturing. Knowledge stocks within all communities associated with research in the servitization of manufacturing have dramatically increased since the mid-1990s. The trends clearly reveal that the operations community is in receipt of the majority of citations relating to the servitization of manufacturing. In terms of knowledge flows, it is apparent that the more mature communities are drawing on more locally produced knowledge stocks, whereas the emergent communities are drawing on a knowledge base more evenly distributed across all the communities. The results are indicative of varying degrees of interdependency amongst the communities.
The generic research concerns being addressed within the communities are associated with the concepts of product-service differentiation, competitive strategy, customer value, customer relationships and product-service configuration. Originality/value: This research has further developed and articulated the identities of distinct researcher communities actively contributing to knowledge production in the servitization of manufacturing, and to what extent they are pursuing common research agendas. This study provides an improved descriptive and thematic awareness of the resulting body of knowledge, allowing the field of servitization to progress in a more informed and multidisciplinary fashion. © Emerald Group Publishing Limited.
Abstract:
Uncertainty can be defined as the difference between information that is represented in an executing system and the information that is both measurable and available about the system at a certain point in its life-time. A software system can be exposed to multiple sources of uncertainty produced by, for example, ambiguous requirements and unpredictable execution environments. A runtime model is a dynamic knowledge base that abstracts useful information about the system, its operational context and the extent to which the system meets its stakeholders' needs. A software system can successfully operate in multiple dynamic contexts by using runtime models that augment information available at design time with information monitored at runtime. This chapter explores the role of runtime models as a means to cope with uncertainty. To this end, we introduce a suitable terminology for models, runtime models and uncertainty, and present a state-of-the-art summary of model-based techniques for addressing uncertainty at both development time and runtime. Using a case study about robot systems, we discuss how current techniques and the MAPE-K loop can be used together to tackle uncertainty. Furthermore, we propose possible extensions of the MAPE-K loop architecture with runtime models to further handle uncertainty at runtime. The chapter concludes by identifying key challenges and enabling technologies for using runtime models to address uncertainty, and also identifies closely related research communities that can foster ideas for resolving the challenges raised. © 2014 Springer International Publishing.
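To make the MAPE-K idea concrete, here is a minimal toy sketch of a Monitor-Analyse-Plan-Execute loop over a runtime model as the shared knowledge base, loosely in the spirit of the chapter's robot case study; the managed system, thresholds and names are all invented for illustration.

```python
class RuntimeModel:
    """Knowledge base: abstracts monitored state of the managed system."""
    def __init__(self):
        self.state = {"battery": 1.0, "goal": "patrol"}

class Robot:
    """Toy managed system: patrolling drains the battery, recharging refills it."""
    def __init__(self):
        self.battery = 1.0
        self.mode = "patrol"
    def tick(self):
        if self.mode == "patrol":
            self.battery -= 0.3
        else:  # recharging
            self.battery = min(1.0, self.battery + 0.5)

def mape_k(robot, model):
    # Monitor: refresh the runtime model from sensor readings.
    model.state["battery"] = robot.battery
    # Analyse: detect a violation of the stakeholder requirement.
    low = model.state["battery"] < 0.4
    # Plan: choose an adaptation if needed, else pursue the goal.
    plan = "recharge" if low else model.state["goal"]
    # Execute: actuate the managed system.
    robot.mode = plan

robot, model = Robot(), RuntimeModel()
for _ in range(5):
    robot.tick()
    mape_k(robot, model)
print(robot.mode, round(robot.battery, 2))
```

The loop never inspects the robot directly during Analyse and Plan; all decisions go through the runtime model, which is what allows design-time knowledge to be augmented with monitored information.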
Abstract:
This project is focused on exchanging knowledge between ABS, UKBI and managers of business incubators in the UK. The project relates to the exploitation of the extant knowledge base on assessing and improving business incubation management practice and performance, and builds on two earlier studies. It addresses a pressing need for assessing and benchmarking business incubation input, process and outcome performance and highlighting best practice. The overarching aim of this project was to obtain proof-of-concept for a business incubation performance assessment and benchmarking online tool, fine-tune it and put it in use by nurturing a community of business incubation management practice, aligned by the resultant tool. The purpose was to offer an appropriate set of measures, in areas identified as critical by relevant research on business incubation performance management and impact, against which: 1. The input and process performance of business incubation management practice can be assessed and benchmarked within the auspices of a community of incubator managers concerned with best practice; 2. The outcome performance and impact of business incubators can be assessed longitudinally. As such, the developed online assessment framework is geared towards the needs of researchers, policy makers and practitioners concerned with business incubation performance, added value and impact.
Abstract:
The copyright industries — such as music, film, software and publishing — occupy a significant and growing share of economic activity. Current copyright law protects the creator for up to 70 years after their death, significantly longer than patent protection (20 years after invention). Copyright law aims to balance the incentive to create new work against the costs associated with high prices and restricted access to this work. This paper reviews the economic issues behind copyright and how these are challenged by changes in technology and market structure. While economics provides a powerful conceptual framework for understanding the trade-offs involved, the paper argues that our empirical knowledge base is very weak. Much more empirical analysis is needed to understand the impacts of changes to copyright legislation. Without such analysis, policy and legal debates will continue to be based largely on anecdote and rhetoric.
Abstract:
Introduction - The pace of structural change in the UK health economies, the new focus on regulation and the breaking down of professional boundaries mean that the Royal Pharmaceutical Society of Great Britain (RPSGB) has to continually review the scope, range and outputs of education provided by schools of pharmacy (SOPs). In SOPs, the focus is on equipping students with the knowledge, skills and attitudes necessary to successfully engage with the pre-registration year. The aim of this study [1] was to map current programmes and undergraduate experiences to inform the RPSGB debate. The specific objectives of this paper are to describe elements of the survey of final-year undergraduates and to explore student opinions and experiences of their workload, teaching, learning and assessment.
Material and methods - The three main research techniques were: (1) quantitative course document review, (2) qualitative staff interview and (3) quantitative student self-completion survey. The questions in the survey were based on findings from exploratory focus group work with BPSA (British Pharmaceutical Students' Association) members and were designed to ascertain whether views expressed in the focus groups on the volume and format of assessments were held by the general student cohort. The student self-completion questionnaire, consisting of 31 questions, was administered in 2005 to all (n=1847) final-year undergraduates, using a pragmatic mixture of methods. The sample was 15 SOPs within the UK (1 SOP opted out). The total response rate was 50.62% (n=935); it varied by SOP from 14.42% to 84.62%. The survey data were analysed (n=741) using SPSS, excluding non-UK students who may have undertaken part of their studies within a non-UK university.
Results and discussion
• 76% (n=562) of respondents considered that the amount of formal assessment was about right; 21% (n=158) thought it was too much.
• There was agreement that the MPharm seems to have more assessment than other courses, with 63% (n=463) strongly agreeing or agreeing.
• The majority considered the balance between examinations and coursework was about right (67%, n=498), with 27% (n=198) agreeing that the balance was too far weighted towards examinations.
• 57% (n=421) agreed that the focus of MPharm assessment was too much towards memorised knowledge; 40% (n=290) thought it was about right.
• 78% (n=575) agreed with the statement "Assessments don't measure the skills for being a pharmacist, they just measure your knowledge base". Only 10% (n=77) disagreed.
• Similarly, 49% (n=358) disagreed with, and 35% (n=256) were not sure about, the statement "I consider that the assessments used in the MPharm course adequately measure the skills necessary to be a pharmacist". Only 17% (n=124) agreed.
Experience from this study shows the difficulty of administering survey instruments through UK Schools of Pharmacy. It is heavily dependent on timing, goodwill and finding the right person. The variability of the response rate between SOPs precluded any detailed analysis by School. Nevertheless, there are some interesting results. Issues raised in the exploratory focus group work about the amount of assessment and over-reliance on knowledge have been confirmed. There is a real debate to be had about the extent to which the undergraduate course, which must instil scientific knowledge, can provide students with the requisite qualities, skills, attitudes and behaviour that are more easily acquired in the pre-registration year.
References
[1] Wilson K, Jesson J, Langley C, Clarke L, Hatfield K. MPharm Programmes: Where are we now? Report commissioned by the Pharmacy Practice Research Trust, 2005.
Abstract:
This paper draws on contributions to and discussions at a recent MRC HSRC-sponsored workshop 'Researching users' experiences of health care: the case of cancer'. We focus on the methodological and ethical challenges that currently face researchers who use self-report methods to investigate experiences of cancer and cancer care. These challenges relate to: the theoretical and conceptual underpinnings of research; participation rates and participant profiles; data collection methods (the retrospective nature of accounts, description and measurement, and data collection as intervention); social desirability considerations; relationship considerations; the experiences of contributing to research; and the synthesis and presentation of findings. We suggest that methodological research to tackle these challenges should be integrated into substantive research projects to promote the development of a strong knowledge base about experiences of cancer and cancer care.
Abstract:
The principles of design of an information-analytical system (IAS) intended for the design of new inorganic compounds are considered. The IAS includes an integrated system of databases on the properties of inorganic substances and materials, a system of pattern-recognition programs, a knowledge base and a managing program. The IAS allows the prediction of inorganic compounds not yet synthesized and the estimation of some of their properties.
Abstract:
One of the most important problems of e-learning systems, the building of a data domain model, is studied in this paper. The data domain model is based on a correctly organised knowledge base. A production-frame model is offered, which allows structuring the data domain and building a flexible and understandable inference system residing in a production system.
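As a hedged toy sketch of the general production-frame idea (the paper's actual model is not given in the abstract), frames can be represented as slot-value structures and productions as condition-action rules fired by forward chaining; the e-learning domain, rules and names below are invented for illustration.

```python
# Frames: slot-value dictionaries describing the data domain.
facts = {"learner": {"level": "beginner", "completed": ["intro"]}}

# Productions: (condition, action) pairs over the frames.
rules = [
    (lambda f: "intro" in f["learner"]["completed"]
               and f["learner"]["level"] == "beginner",
     lambda f: f["learner"].update(next_unit="basics")),
    (lambda f: f["learner"].get("next_unit") == "basics",
     lambda f: f["learner"].update(level="intermediate")),
]

def forward_chain(facts, rules, max_cycles=10):
    """Fire applicable productions until no rule changes the frames."""
    for _ in range(max_cycles):
        before = repr(facts)
        for cond, act in rules:
            if cond(facts):
                act(facts)
        if repr(facts) == before:   # fixed point reached
            break
    return facts

print(forward_chain(facts, rules)["learner"])
```

The frames keep the domain structured and inspectable, while the production rules keep the inference steps explicit, which is the flexibility-plus-understandability combination the paper argues for.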
Abstract:
The article proposes a model for managing information about program flow analysis for conducting computer experiments with program transformations. It considers the architecture and context of the flow-analysis subsystem within the framework of the Specialized Knowledge Bank on Program Transformations and describes the language for presenting flow-analysis methods in the knowledge bank.
Abstract:
The article presents a new method for estimating the usability of a user interface based on its model. The principal features of the method are: the creation of an expandable knowledge base of usability defects; the detection of defects based on the interface model, within the design phase; and feedback that informs the developer not only of the existence of defects but also advises on their elimination.
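A hedged sketch of this scheme (the article's actual knowledge-base format and interface model are not given in the abstract) might pair each defect entry with a detection predicate over the model and a piece of advice; the defect rules, widget model and names below are invented for illustration.

```python
# Expandable knowledge base of usability defects: each entry pairs a
# detection predicate over the interface model with developer advice.
DEFECT_KB = [
    {"name": "missing label",
     "check": lambda w: w["type"] == "input" and not w.get("label"),
     "advice": "Add a visible text label to the input field."},
    {"name": "tiny click target",
     "check": lambda w: w["type"] == "button" and w.get("height", 0) < 24,
     "advice": "Enlarge the button to at least 24 px in height."},
]

def audit(interface_model):
    """Return (widget id, defect name, advice) for every detected defect."""
    findings = []
    for widget in interface_model:
        for defect in DEFECT_KB:
            if defect["check"](widget):
                findings.append((widget["id"], defect["name"],
                                 defect["advice"]))
    return findings

model = [
    {"id": "email", "type": "input"},                  # no label -> defect
    {"id": "ok", "type": "button", "height": 16},      # too small -> defect
    {"id": "name", "type": "input", "label": "Name"},  # passes both checks
]
for wid, name, advice in audit(model):
    print(f"{wid}: {name} - {advice}")
```

Because the knowledge base is just a list of entries, extending it with new defect classes requires no change to the audit logic, which is what makes such a base "expandable" in practice.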
Abstract:
The problem of formalising the matching of the functioning characteristics of different management subjects, obtained from financial flow analysis, is considered. Formal generalisations for deriving elements of the knowledge base of an economic security system are presented. One direction for establishing feedback between the knowledge base of the economic security system and the financial flow analysis database is substantiated.