61 results for behavior-based systems
Abstract:
Original Paper
European Journal of Information Systems (2001) 10, 135–146; doi:10.1057/palgrave.ejis.3000394
Organisational learning—a critical systems thinking discipline
P Panagiotidis1,3 and J S Edwards2,4
1Deloitte and Touche, Athens, Greece; 2Aston Business School, Aston University, Aston Triangle, Birmingham, B4 7ET, UK
Correspondence: Dr J S Edwards, Aston Business School, Aston University, Aston Triangle, Birmingham, B4 7ET, UK. E-mail: j.s.edwards@aston.ac.uk
3Petros Panagiotidis is Manager responsible for the Process and Systems Integrity Services of Deloitte and Touche in Athens, Greece. He has a BSc in Business Administration and an MSc in Management Information Systems from Western International University, Phoenix, Arizona, USA; an MSc in Business Systems Analysis and Design from City University, London, UK; and a PhD degree from Aston University, Birmingham, UK. His doctorate was in Business Systems Analysis and Design. His principal interests now are in the ERP/DSS field, where he serves as project leader and project risk management leader in the implementation of SAP and JD Edwards/Cognos for various major clients in the telecommunications and manufacturing sectors. In addition, he is responsible for the development and application of knowledge management systems and activity-based costing systems.
4John S Edwards is Senior Lecturer in Operational Research and Systems at Aston Business School, Birmingham, UK. He holds MA and PhD degrees (in mathematics and operational research respectively) from Cambridge University. His principal research interests are in knowledge management and decision support, especially methods and processes for system development. He has written more than 30 research papers on these topics, and two books, Building Knowledge-based Systems and Decision Making with Computers, both published by Pitman.
Current research work includes the effect of scale of operations on knowledge management, interfacing expert systems with simulation models, process modelling in law and legal services, and a study of the use of artificial intelligence techniques in management accounting.
Abstract: This paper deals with the application of critical systems thinking in the domain of organisational learning and knowledge management. Its viewpoint is that deep organisational learning only takes place when the business system's stakeholders reflect on their actions and thus inquire about their purpose(s) in relation to the business system and the other stakeholders they perceive to exist. This is done by reflecting both on the sources of motivation and/or deception that are contained in their purpose, and also on the sources of collective motivation and/or deception that are contained in the business system's purpose. The development of an organisational information system that captures, manages and institutionalises meaningful information—a knowledge management system—cannot be separated from organisational learning practices, since it should be the result of these very practices. Although Senge's five disciplines provide a useful starting-point in looking at organisational learning, we argue for a critical systems approach, instead of an uncritical System Dynamics one that concentrates only on the organisational learning practices. We proceed to outline a methodology called Business Systems Purpose Analysis (BSPA) that offers a participatory structure for team and organisational learning, upon which the stakeholders can take legitimate action that is based on the force of the better argument. In addition, the organisational learning process in BSPA leads to the development of an intrinsically motivated organisational information system that allows for the institutionalisation of the learning process itself in the form of an organisational knowledge management system.
This could be a specific application, or something as wide-ranging as an Enterprise Resource Planning (ERP) implementation. Examples of the use of BSPA in two ERP implementations are presented.
Abstract:
Vaccination remains a key tool in the protection against and eradication of diseases. However, the development of new safe and effective vaccines is not easy. Various live-organism-based vaccines currently licensed exhibit high efficacy; however, this benefit is associated with risk, due to the adverse reactions found with these vaccines. Therefore, in the development of vaccines, the associated risk-benefit issues need to be addressed. Sub-unit proteins offer a much safer alternative; however, their efficacy is low. The use of adjuvanted systems has proven to enhance the immunogenicity of these sub-unit vaccines through protection (i.e. preventing degradation of the antigen in vivo) and enhanced targeting of these antigens to professional antigen-presenting cells. Understanding the immunological implications of the related disease will enable validation of the design and development of potential adjuvant systems. Novel adjuvant research combines pharmaceutical analysis with detailed immunological investigations, whereby pharmaceutically designed adjuvants are driven by an increased understanding of the mechanisms of adjuvant activity, largely facilitated by the description of highly specific innate immune recognition of components usually associated with the presence of invading bacteria or viruses. The majority of pharmaceutically based adjuvants currently being investigated are particulate delivery systems, such as liposome formulations. As an adjuvant, liposomes have been shown to enhance immunity against the associated disease, particularly when a cationic lipid is used within the formulation. In addition, the inclusion of components such as immunomodulators further enhances immunity. Within this review, the use and application of effective adjuvants is investigated, with particular emphasis on liposomal-based systems.
The mechanisms of adjuvant activity, analysis of complex immunological characteristics and formulation and delivery of these vaccines are considered.
Abstract:
Since much knowledge is tacit, eliciting knowledge is a common bottleneck during the development of knowledge-based systems. Visual interactive simulation (VIS) has been proposed as a means for eliciting experts’ decision-making by getting them to interact with a visual simulation of the real system in which they work. In order to explore the effectiveness and efficiency of VIS-based knowledge elicitation, an experiment was carried out with decision-makers in a Ford Motor Company engine assembly plant. The model properties under investigation were the level of visual representation (2-dimensional, 2½-dimensional and 3-dimensional) and the model parameter settings (unadjusted and adjusted to represent more uncommon and extreme situations). The conclusion from the experiment is that using a 2-dimensional representation with adjusted parameter settings provides the best simulation-based means for eliciting knowledge, at least for the case modelled.
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
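The idea of re-defining a 'classic' metric for a declarative language can be illustrated with a minimal sketch. The counting rules below are illustrative assumptions, not the thesis's actual definitions: a McCabe-style count for a Prolog predicate that treats each clause beyond the first, each disjunction (;) and each if-then (->) as an additional decision point.

```python
# Hypothetical sketch of a McCabe-style cyclomatic complexity count
# re-defined for Prolog source text. The counting rules here are
# illustrative assumptions, not those used in the thesis.
import re

def prolog_cyclomatic(source: str) -> int:
    """Approximate complexity: 1 + extra clauses + ';' + '->' occurrences."""
    # Strip line comments so operators inside comments are not counted.
    code = re.sub(r"%.*", "", source)
    # Count clause terminators (a '.' at end of line or end of input).
    clauses = len(re.findall(r"\.\s*(?:\n|$)", code))
    disjunctions = code.count(";")    # explicit disjunction operator
    if_thens = code.count("->")       # if-then construct
    extra_clauses = max(clauses - 1, 0)
    return 1 + extra_clauses + disjunctions + if_thens

example = """
max(X, Y, X) :- X >= Y.
max(_, Y, Y).
"""
print(prolog_cyclomatic(example))  # two clauses, no ;/->, so complexity 2
```

A real study would of course need a proper Prolog parser rather than textual counts, but the sketch shows how the notion of a "decision point" transfers from procedural branches to clause selection.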
Abstract:
This research primarily focused on identifying the formulation parameters which control the efficacy of liposomes as delivery systems to enhance the delivery of poorly soluble drugs. Preliminary studies focused on the drug loading of ibuprofen within vesicle systems. Initially both liposomal and niosomal formulations were screened for their drug-loading capacity: liposomal systems were shown to offer significantly higher ibuprofen loading, and thereafter lipid-based systems were further investigated. Given the key role cholesterol is known to play in the stability of bilayer vesicles, the optimum cholesterol content in terms of drug loading and release of poorly soluble drugs was then investigated. From these studies a concentration of 11 total molar % of cholesterol was used as a benchmark for all further formulations. Investigating the effect of liposome composition on several low-solubility drugs, drug loading was shown to be enhanced by adopting longer chain length lipids, cationic lipids and decreasing drug molecular weight. Drug release was increased by using cationic lipids and lower molecular weight drugs; conversely, a reduction was noted when employing longer chain lipids, thus supporting the rationale of longer chain lipids producing more stable liposomes, a theory also supported by results obtained via Langmuir studies, although it was revealed that stability is also dependent on geometric features associated with the lipid chain moiety. Interestingly, a reduction in drug loading appeared to be induced when symmetrical phospholipids were substituted with lipids containing asymmetrical alkyl chain groups, thus further highlighting the importance of lipid geometry. Combining a symmetrical lipid with an asymmetrical derivative enhanced the encapsulation of one hydrophobic drug while reducing that of another, suggesting the importance of drug characteristics.
Phosphatidylcholine liposomes could successfully be prepared (and visualised using transmission electron microscopy) from fatty alcohols, therefore offering an alternative liposomal stabiliser to cholesterol. Results obtained revealed that liposomes containing tetradecanol within their formulation share similar vesicle size, drug encapsulation, surface charge and toxicity profiles with liposomes formulated with cholesterol; however, the tetradecanol preparation appeared to release considerably more drug during stability studies. Langmuir monolayer studies revealed that the condensing influence of tetradecanol is less than that of cholesterol, suggesting that this reduced intercalation by the former could explain why the tetradecanol formulation released more drug than cholesterol formulations. Environmental scanning electron microscopy (ESEM) was used to analyse the morphology and stability of liposomes. These investigations indicated that the presence of drugs within the liposomal bilayer was able to enhance the stability of the bilayers against collapse under reduced hydration conditions. In addition, the presence of charged lipids within the formulation appeared to influence bilayer behaviour under reduced hydration conditions compared with its neutral counterpart. However, the applicability of using ESEM as a new method to investigate liposome stability appears less valid than first hoped, since the results are often open to varied interpretation and do not provide a robust set of data to support conclusions in some cases.
Abstract:
Despite expectations being high, the industrial take-up of Semantic Web technologies in developing services and applications has been slower than expected. One of the main reasons is that many legacy systems have been developed without considering the potential of the Web in integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a tedious and expensive process, which carries a significant risk of failure. There is an urgent need to provide strategies allowing the migration of legacy systems to Semantic Web Services platforms, and also tools to support such strategies. In this paper we propose a methodology and its tool support for transitioning these applications to Semantic Web Services, which allow users to migrate their applications to Semantic Web Services platforms automatically or semi-automatically. The transition of the GATE system is used as a case study. © 2009 - IOS Press and the authors. All rights reserved.
Abstract:
The paper discusses both the complementary factors and the contradictions of adopting ERP-based systems with enterprise 2.0. ERP is well known for its efficient business process management; its high implementation failure rate is equally well known. According to [1], ERP systems can achieve efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. Enterprise 2.0, by contrast, supports flexible business process management and informal, less structured interactions [3],[4],[21]. Traditional researchers have claimed that efficiency and flexibility may seem incompatible in that they are different business objectives and may exist in different organizational environments. However, the paper breaks with this traditional norm by combining ERP and enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, four cases presented different attitudes towards the use of ERP systems and enterprise social systems. Drawing on socio-technical theory, the paper presents an in-depth analysis of the benefits of combining ERP with enterprise 2.0 for these firms.
Abstract:
Over the past decade or so a number of changes have been observed in traditional Japanese employment relations (ERs) systems, such as an increase in non-regular workers, a move towards performance-based systems and a continuous decline in union membership. There is a large body of Anglo-Saxon and Japanese literature providing evidence that national factors such as national institutions, national culture, and the business and economic environment have significantly influenced what were hitherto three ‘sacred’ aspects of Japanese ERs systems (ERSs). However, no research has been undertaken until now at the firm level regarding the extent to which changes in national factors influence ERSs across firms. This article develops a model to examine the impact of national factors on ER systems, and analyses that impact on ER systems at the firm level. Based on information collected from two different groups of companies, namely the Mitsubishi Chemical Group (MCG) and the Federation of Shinkin Bank (FSB), the research finds that, except for a few similarities, the impact of national factors on Japanese ER systems differs at the firm level. This indicates that the impact of national factors varies in the implementation of employment relations factors. In the case of MCG, national culture has little to do with the seniority-based system. The study also reveals that national culture factors likewise have little influence on the enterprise-based system in the case of FSB. This analysis is useful for domestic and international organizations as it helps to better understand the role of national factors in determining Japanese ERSs.
Abstract:
The project consists of an experimental and numerical modelling study of the application of ultra-long Raman fibre laser (URFL) based amplification techniques for high-speed multi-wavelength optical communications systems. The research is focused on telecommunications C-band transmission at 40 Gb/s data rates with direct and coherent detection. The optical transmission performance of URFL-based systems in terms of optical noise, gain bandwidth and gain flatness for different system configurations is evaluated. Systems with different overall span lengths, transmission fibre types and data modulation formats are investigated. Performance is compared with conventional Erbium-doped fibre amplifier based systems to identify system configurations where URFL-based amplification provides performance or commercial advantages.
Abstract:
The chapter discusses both the complementary factors and contradictions of adopting ERP-based systems with Enterprise 2.0. ERP is well known for its efficient business process management. Enterprise 2.0 supports flexible business process management and informal, less structured interactions. Traditional studies indicate efficiency and flexibility may seem incompatible because they are different business objectives and may exist in different organizational environments. However, the chapter breaks traditional norms by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, the chapter analyzes the benefits and risks of the combination of ERP with Enterprise 2.0 from process, organization, and people paradigms. © 2013 by IGI Global.
Abstract:
Conventional tools for the measurement of laser spectra (e.g. optical spectrum analysers) capture data averaged over a considerable time period. However, the generation spectrum of many laser types may involve spectral dynamics whose relatively fast time scale is determined by their cavity round-trip period, calling for instrumentation featuring both high temporal and spectral resolution. Such real-time spectral characterisation becomes particularly challenging if the laser pulses are long, or if they have continuous or quasi-continuous wave radiation components. Here we combine optical heterodyning with a technique of spatio-temporal intensity measurements that allows the characterisation of such complex sources. Fast, round-trip-resolved spectral dynamics of cavity-based systems are obtained in real time, with a temporal resolution of one cavity round trip and a frequency resolution defined by its inverse (85 ns and 24 MHz respectively are demonstrated). We also show how, under certain conditions for quasi-continuous wave sources, the spectral resolution can be further increased by a factor of 100 by direct extraction of phase information from the heterodyned dynamics or by using double time scales within the spectrogram approach.
Abstract:
The chapter discusses both the complementary factors and contradictions of adopting ERP-based systems with enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. It is claimed that enterprise 2.0 can support flexible business process management and so incorporate informal and less structured interactions. A traditional view, however, is that efficiency and flexibility objectives are incompatible, as they are different business objectives which are pursued separately in different organizational environments. Thus an ERP system with a primary objective of improving efficiency and an enterprise 2.0 system with a primary aim of improving flexibility may represent a contradiction and lead to a high risk of failure if adopted simultaneously. This chapter uses case study analysis to investigate the use of a combination of ERP and enterprise 2.0 in a single enterprise with the aim of improving both efficiency and flexibility in operations. The chapter provides an in-depth analysis of the combination of ERP with enterprise 2.0 based on socio-technical information systems management theory. The chapter also provides a summary of the benefits of the combination of ERP systems and enterprise 2.0 and how they could contribute to the development of a new generation of business management that combines both formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, but also have the flexibility to react in an agile way to internal and external events.
Abstract:
A plethora of techniques for the imaging of liposomes and other bilayer vesicles are available. However, sample preparation and the technique chosen should be carefully considered in conjunction with the information required. For example, larger vesicles such as multilamellar and giant unilamellar vesicles can be viewed using light microscopy, and whilst confirmation of vesicle formation and size can be undertaken prior to additional physical characterisation or more detailed microscopy, the technique is limited in terms of resolution. To consider the options available for visualising liposome-based systems, a wide range of microscopy techniques are described and discussed here: these include light, fluorescence and confocal microscopy and various electron microscopy techniques such as transmission, cryo, freeze-fracture and environmental scanning electron microscopy. Their application, advantages and disadvantages are reviewed with regard to their use in the analysis of lipid vesicles.
Abstract:
This second issue of Knowledge Management Research & Practice (KMRP) continues the international nature of the first issue, with papers from authors based on four different continents. There are five regular papers, plus the first of what is intended to be an occasional series of 'position papers' from respected figures in the knowledge management field, who have specific issues they wish to raise from a personal standpoint. The first two regular papers are both based on case studies. The first is 'Aggressively pursuing knowledge management over two years: a case study of a US government organization' by Jay Liebowitz. Liebowitz is well known to both academics and practitioners as an author on knowledge management and knowledge-based systems. Government departments in many Western countries must soon face up to the problems that will occur as the 'baby boomer' generation reaches retirement age over the next decade. This paper describes how one particular US government organization has attempted to address this situation (and others) through the introduction of a knowledge management initiative. The second case study paper is 'Knowledge creation through the synthesizing capability of networked strategic communities: case study on new product development in Japan' by Mitsuru Kodama. This paper looks at the importance of strategic communities - communities that have strategic relevance and support - in knowledge management. Here, the case study organization is Nippon Telegraph and Telephone Corporation (NTT), a Japanese telecommunications firm. The third paper is 'Knowledge management and intellectual capital: an empirical examination of current practice in Australia' by Albert Zhou and Dieter Fink. This paper reports the results of a survey carried out in 2001, exploring the practices relating to knowledge management and intellectual capital in Australia and the relationship between them. The remaining two regular papers are conceptual in nature.
The fourth is 'The enterprise knowledge dictionary' by Stuart Galup, Ronald Dattero and Richard Hicks. Galup, Dattero and Hicks propose the concept of an enterprise knowledge dictionary and its associated knowledge management system architecture as offering the appropriate form of information technology to support various different types of knowledge sources, while behaving as a single source from the user's viewpoint. The fifth and final regular paper is 'Community of practice and metacapabilities' by Geri Furlong and Leslie Johnson. This paper looks at the role of communities of practice in learning in organizations. Its emphasis is on metacapabilities - the properties required to learn, develop and apply skills. This discussion takes work on learning and core competences to a higher level. Finally, this issue includes a position paper, 'Innovation as an objective of knowledge management. Part I: the landscape of management' by Dave Snowden. Snowden has been highly visible in the knowledge management community thanks to his role as the Director of IBM Global Services' Canolfan Cynefin Centre. He has helped many government and private sector organizations to consider their knowledge management problems and strategies. This, the first part of a two-part paper, is inspired by the notion of complexity. In it, Snowden calls for what he sees as a 20th century emphasis on designed systems for knowledge management to be consigned to history, and replaced by a 21st century emphasis on emergence. Letters to the editor on this, or any other topic related to knowledge management research and practice, are welcome. We trust that you will find the contributions stimulating, and again invite you to contribute your own paper(s) to future issues of KMRP.
Abstract:
Knowledge elicitation is a well-known bottleneck in the production of knowledge-based systems (KBS). Past research has shown that visual interactive simulation (VIS) could effectively be used to elicit episodic knowledge that is appropriate for machine learning purposes, with a view to building a KBS. Nonetheless, the VIS-based elicitation process still has much room for improvement. Based in the Ford Dagenham Engine Assembly Plant, a research project is being undertaken to investigate the individual/joint effects of visual display level and mode of problem case generation on the elicitation process. This paper looks at the methodology employed and some issues that have been encountered to date. Copyright © 2007 Inderscience Enterprises Ltd.