920 results for INTELLIGENCE SYSTEMS METHODOLOGY
Abstract:
This paper presents a multilayered architecture that enhances the capabilities of current QA systems and allows different types of complex questions or queries to be processed. The answers to these questions need to be gathered from factual information scattered throughout different documents. Specifically, we designed a specialized layer to process the different types of temporal questions. Complex temporal questions are first decomposed into simple questions, according to the temporal relations expressed in the original question. In the same way, the answers to the resulting simple questions are recomposed, fulfilling the temporal restrictions of the original complex question. A novel aspect of this approach lies in the decomposition, which uses a minimal quantity of resources, with the final aim of obtaining a portable platform that is easily extensible to other languages. We also present a methodology for evaluating both the decomposition of questions and the ability of the implemented temporal layer to perform at a multilingual level. The temporal layer was first implemented for English, then evaluated and compared with: a) a general-purpose QA system (F-measure 65.47% for QA plus the English temporal layer vs. 38.01% for the general QA system), and b) a well-known QA system. The multilayered system obtained much better results for temporal questions. It was therefore extended to Spanish, and the evaluation again gave very good results (F-measure 40.36% for QA plus the Spanish temporal layer vs. 22.94% for the general QA system).
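As a rough illustration of the decompose/recompose idea (not the paper's implementation), the following Python sketch splits a complex temporal question at a temporal signal word and then filters date-stamped candidate answers against the resulting ordering constraint. The signal-word lexicon, question templates, and the invented example answers are all assumptions for the example.

```python
# Illustrative sketch of the decompose/recompose idea for temporal questions.
# The signal-word lexicon, question templates and date-stamped answer format
# are assumptions for the example, not the paper's implementation.
from datetime import date

TEMPORAL_SIGNALS = {"before", "after"}   # assumed subset of temporal signals

def decompose(question: str):
    """Split a complex temporal question at its temporal signal word."""
    for signal in TEMPORAL_SIGNALS:
        marker = f" {signal} "
        if marker in question:
            main, constraint = question.rstrip("?").split(marker, 1)
            # naive templates; a real system would normalise the grammar
            return main + "?", f"When did {constraint}?", signal
    return question, None, None          # no signal: already a simple question

def recompose(candidates, anchor, relation):
    """Keep answers whose dates satisfy the original temporal restriction."""
    ops = {"before": lambda d: d < anchor, "after": lambda d: d > anchor}
    return [answer for answer, d in candidates if ops[relation](d)]

q_main, q_anchor, rel = decompose(
    "What major European event happened after the Berlin Wall fell?")
# A base QA system would answer q_anchor with an anchor date and q_main
# with date-stamped candidate answers (invented here for illustration):
anchor = date(1989, 11, 9)
candidates = [("Chernobyl disaster", date(1986, 4, 26)),
              ("German reunification", date(1990, 10, 3))]
print(recompose(candidates, anchor, rel))    # ['German reunification']
```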
Abstract:
This work addresses the optimization of ammonia–water absorption cycles for cooling and refrigeration applications, taking account of economic and environmental concerns. Our approach combines the capabilities of process simulation, multi-objective optimization (MOO), cost analysis and life cycle assessment (LCA). The optimization task is posed in mathematical terms as a multi-objective mixed-integer nonlinear program (moMINLP) that seeks to minimize the total annualized cost and environmental impact of the cycle. This moMINLP is solved by an outer-approximation strategy that iterates between primal nonlinear programming (NLP) subproblems with fixed binaries and a tailored mixed-integer linear programming (MILP) model. The capabilities of our approach are illustrated through its application to an ammonia–water absorption cycle used for cooling and refrigeration.
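To make the primal/master alternation concrete, here is a minimal single-objective outer-approximation sketch on a toy MINLP. The problem, bounds, tolerances and solver choices are illustrative assumptions, not the authors' cycle model; in the paper, the multi-objective aspect would be handled around such a loop (e.g. by an epsilon-constraint scheme).

```python
# Toy outer-approximation loop: alternate NLP subproblems with the binary
# fixed and an LP-per-binary master built from gradient cuts. The problem,
# bounds and tolerances are assumptions for illustration only.
#   min  f(x, y) = (x - 2)^2 + 3y
#   s.t. g(x, y) = 1 - x - y <= 0,  0 <= x <= 4,  y in {0, 1}
import numpy as np
from scipy.optimize import linprog, minimize

f = lambda x, y: (x - 2.0) ** 2 + 3.0 * y
g = lambda x, y: 1.0 - x - y
df_dx, df_dy = lambda x: 2.0 * (x - 2.0), 3.0   # gradients for the cuts
dg_dx, dg_dy = -1.0, -1.0

def solve_primal(y):
    """NLP subproblem with the binary fixed: optimise over x only."""
    res = minimize(lambda v: f(v[0], y), x0=[1.0], bounds=[(0, 4)],
                   constraints=[{"type": "ineq", "fun": lambda v: -g(v[0], y)}])
    return res.x[0], res.fun

cuts, best = [], (np.inf, None)
for it in range(10):
    # master: for each binary value, an LP in (x, eta) over the cuts
    lower, y_next = np.inf, 0
    for y in (0, 1):
        A, b = [], []
        for xk, yk in cuts:
            # objective cut: eta >= f(xk,yk) + df_dx*(x-xk) + df_dy*(y-yk)
            A.append([df_dx(xk), -1.0])
            b.append(df_dx(xk) * xk - f(xk, yk) - df_dy * (y - yk))
            # constraint cut: g(xk,yk) + dg_dx*(x-xk) + dg_dy*(y-yk) <= 0
            A.append([dg_dx, 0.0])
            b.append(dg_dx * xk - g(xk, yk) - dg_dy * (y - yk))
        res = linprog(c=[0.0, 1.0], A_ub=A or None, b_ub=b or None,
                      bounds=[(0, 4), (-100, 100)])
        if res.success and res.fun < lower:
            lower, y_next = res.fun, y
    # primal: fix the chosen binary, solve the NLP, update incumbent and cuts
    xk, fk = solve_primal(y_next)
    if fk < best[0]:
        best = (fk, (xk, y_next))
    cuts.append((xk, y_next))
    if best[0] - lower <= 1e-6:                  # upper and lower bounds meet
        break

print("incumbent (f, (x, y)):", best)            # expect f ~ 0 at x = 2, y = 0
```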
Abstract:
Computer science studies have a strong multidisciplinary character, since most graduates do their professional work outside a purely computing environment, in close collaboration with professionals from many different areas. However, the training offered in computer science studies lacks that multidisciplinary component, focusing more on purely technical aspects. In this paper we present a novel experience in which computer science and educational psychology find common ground and realistic collaboration through laboratory practices. Specifically, the work has computer science students develop diagnosis support systems, based on artificial intelligence techniques, which could then be used by future educational psychologists. The applications developed by the computer science students build a model for the diagnosis of pervasive developmental disorders (PDD), also commonly called autism spectrum disorders (ASD). The complexity of this diagnosis, due not only to the distinctive characteristics of each person affected but also to the large number of variables involved, requires very strong and close interdisciplinary participation. This work demonstrates that it is possible to intervene at the curricular level, within the university, to promote the development of interpersonal skills. It thereby provides a methodology for designing interdisciplinary practices and a guide for their monitoring and evaluation. The results are very encouraging, since we obtained significant differences in academic achievement between students who attended a course using the new methodology and those who did not.
Abstract:
Business Intelligence (BI) applications have been gradually ported to the Web in search of a global platform for the consumption and publication of data and services. On the Internet, apart from techniques for data/knowledge management, BI Web applications need interfaces with a high level of interoperability (similar to traditional desktop interfaces) for the visualisation of data/knowledge. In some cases, this has been provided by Rich Internet Applications (RIA). The development of these BI RIAs has traditionally been performed manually and, given the complexity of the final application, it is a process prone to errors. The application of model-driven engineering techniques can reduce the cost of development and maintenance (in terms of time and resources) of these applications, as has been demonstrated for other types of Web applications. In the light of these issues, this paper introduces Sm4RIA-B, a model-driven methodology for the development of RIAs as BI Web applications. In order to overcome the limitations of RIAs regarding knowledge management on the Web, the paper also presents a new RIA platform for BI, called RI@BI, which extends the functionalities of traditional RIAs by means of Semantic Web technologies and B2B techniques. Finally, we evaluate the whole approach on a case study: the development of a social network site for an enterprise project manager.
Abstract:
This paper provides a conceptual framework for the estimation of the farm labour and other factor-derived demand and output supply systems. In order to analyse the drivers of labour demand in agriculture and account for the impact of policies on those decisions, it is necessary to acknowledge the interaction between the different factor markets. For this purpose, we present a review of the theoretical background to primal and dual representations of production and some empirical literature that has made use of derived demand systems. The main focus of the empirical work is to study the effect of market distortions in one market, through inefficient pricing, on the demand for other inputs. Therefore, own-price and cross-price elasticities of demand become key variables in the analysis. The dual cost function is selected as the most appropriate approach, where input prices are assumed to be exogenous. A commonly employed specification – and one that is particularly convenient due to its flexible form – is the translog cost function. The analysis consists of estimating the system of cost-share equations, in order to obtain the derived demand functions for inputs. Thus, the elasticities of factor substitution can be used to examine the complementarity/substitutability between inputs.
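For concreteness, a standard translog specification and the cost-share system it yields via Shephard's lemma can be written as follows (generic notation, not tied to a particular study):

```latex
\ln C = \alpha_0 + \sum_i \alpha_i \ln p_i
      + \tfrac{1}{2}\sum_i\sum_j \gamma_{ij}\,\ln p_i \ln p_j
      + \beta_y \ln y + \sum_i \delta_i \ln p_i \ln y,
\qquad
s_i = \frac{\partial \ln C}{\partial \ln p_i}
    = \alpha_i + \sum_j \gamma_{ij} \ln p_j + \delta_i \ln y,
```

with symmetry ($\gamma_{ij}=\gamma_{ji}$) and linear homogeneity in prices ($\sum_i \alpha_i = 1$, $\sum_j \gamma_{ij} = 0$, $\sum_i \delta_i = 0$) imposed. Estimating the share equations recovers the $\gamma_{ij}$, from which the Allen partial elasticities of substitution $\sigma_{ij} = (\gamma_{ij} + s_i s_j)/(s_i s_j)$ for $i \neq j$, and hence the cross-price elasticities $\varepsilon_{ij} = s_j \sigma_{ij}$, follow directly.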
Abstract:
Transportation Department, Research and Special Programs Administration, Washington, D.C.
Abstract:
Urban Mass Transportation Administration, Washington, D.C.
Abstract:
As a means of benchmarking their position and anticipating an uncertain future, the identification of frameworks of critical information systems (IS) management issues is becoming an increasingly important research task for both academics and industrialists. This paper describes and summarises previous work on identifying IS issues frameworks, reviewing 20 research investigations in terms of what they studied and how they were conducted. It also suggests possible directions and methodologies for future research. The summary and suggestions for further work apply to issues-framework research in the IS management field as well as in other business and management areas.
Abstract:
Original Paper: European Journal of Information Systems (2001) 10, 135–146; doi:10.1057/palgrave.ejis.3000394. Organisational learning—a critical systems thinking discipline. P Panagiotidis (Deloitte and Touche, Athens, Greece) and J S Edwards (Aston Business School, Aston University, Birmingham, UK).
This paper deals with the application of critical systems thinking in the domain of organisational learning and knowledge management. Its viewpoint is that deep organisational learning only takes place when the business system's stakeholders reflect on their actions and thus inquire about their purpose(s) in relation to the business system and the other stakeholders they perceive to exist. This is done by reflecting both on the sources of motivation and/or deception contained in their own purpose, and on the sources of collective motivation and/or deception contained in the business system's purpose. The development of an organisational information system that captures, manages and institutionalises meaningful information—a knowledge management system—cannot be separated from organisational learning practices, since it should be the result of these very practices. Although Senge's five disciplines provide a useful starting point for looking at organisational learning, we argue for a critical systems approach, instead of an uncritical System Dynamics one that concentrates only on the organisational learning practices.
We proceed to outline a methodology called Business Systems Purpose Analysis (BSPA) that offers a participatory structure for team and organisational learning, upon which the stakeholders can take legitimate action based on the force of the better argument. In addition, the organisational learning process in BSPA leads to the development of an intrinsically motivated organisational information system that allows the learning process itself to be institutionalised in the form of an organisational knowledge management system. This could be a specific application, or something as wide-ranging as an Enterprise Resource Planning (ERP) implementation. Examples of the use of BSPA in two ERP implementations are presented.
Abstract:
Purpose – The international nuclear community continues to face the challenge of managing both legacy waste and the new wastes that emerge from ongoing energy production. The UK is in the early stages of proposing a new convention for its nuclear industry: waste minimisation through close management of the radioactive source which creates the waste. This paper proposes a new technique, the waste and source material operability study (WASOP), to qualitatively analyse a complex, waste-producing system in order to minimise avoidable waste and thus increase the protection of the public and the environment.
Design/methodology/approach – WASOP critically considers the systemic impact of upstream and downstream facilities on the minimisation of nuclear waste in a facility. Based on the principles of HAZOP, the technique structures managers' thinking about the impact of mal-operations in interlinked facilities, in order to identify preventative actions that reduce the impact of those mal-operations on waste production.
Findings – WASOP was tested with a small group of experienced nuclear regulators and was found to support their qualitative examination of waste minimisation and to help them work towards developing a plan of action.
Originality/value – Given the newness of this convention, the wider methodology in which WASOP sits is still in development. However, this paper communicates the latest thinking from nuclear regulators on decision-making methodology for supporting waste minimisation and, it is hoped, will form part of future regulatory guidance. WASOP is believed to have widespread potential application to the minimisation of many other forms of waste, including waste from other energy sectors and household/general waste.
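Since WASOP adapts HAZOP's guideword approach, the following minimal Python sketch shows how guidewords might be crossed with inter-facility streams to prompt structured questions about waste consequences. The guidewords, streams and prompt wording are our illustrative assumptions, not the published technique.

```python
# Minimal sketch of a HAZOP-style guideword review adapted to waste/source
# streams between facilities; guidewords, streams and prompts are assumptions.
from itertools import product

GUIDEWORDS = ["no", "more", "less", "other than"]     # classic HAZOP prompts
STREAMS = [("fuel storage", "reprocessing", "spent fuel"),
           ("reprocessing", "effluent treatment", "liquid effluent")]

def wasop_prompts(guidewords, streams):
    """Generate one mal-operation question per guideword/stream pair."""
    for word, (src, dst, material) in product(guidewords, streams):
        yield (f"What if {word} {material} flows from {src} to {dst}, "
               f"and what avoidable waste would that create downstream?")

for prompt in wasop_prompts(GUIDEWORDS, STREAMS):
    print(prompt)
```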
Abstract:
The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems, and introduces an object-oriented methodology that enables the modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. The thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software tool kit was developed and used to implement a Petri net analysis tool, and Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can-sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
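To make the modelling concepts concrete, here is a minimal place/transition Petri net in Python, with an enabled-transition check and marking evolution; it is an illustrative sketch of the formalism, not the thesis's C++ tool kit, and the example net is an assumption. The toy net exhibits exactly the conflict over a shared resource that the abstract mentions.

```python
# Minimal place/transition Petri net: a sketch of the concepts discussed,
# not the thesis's C++ tool kit. Structure and the example net are assumptions.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Toy mutual-exclusion net: two machines compete for one shared resource.
net = PetriNet({"idle1": 1, "idle2": 1, "resource": 1})
net.add_transition("start1", {"idle1": 1, "resource": 1}, {"busy1": 1})
net.add_transition("end1", {"busy1": 1}, {"idle1": 1, "resource": 1})
net.add_transition("start2", {"idle2": 1, "resource": 1}, {"busy2": 1})
net.fire("start1")
print(net.enabled("start2"))   # False: conflict, the shared resource is taken
```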
Abstract:
There is an increasing emphasis on the use of software to control safety-critical plants across a wide range of applications. The importance of ensuring the correct operation of such potentially hazardous systems places the emphasis on verifying the system against a suitably secure specification. However, the process of verification is often made more complex by the concurrency and real-time considerations inherent in many applications. One response is the use of formal methods for the specification and verification of safety-critical control systems. These provide a mathematical representation of a system which permits reasoning about its properties. This thesis investigates the use of the formal method Communicating Sequential Processes (CSP) for the verification of a safety-critical control application. CSP is a discrete-event-based process algebra which has a compositional axiomatic semantics that supports verification by formal proof. The application is an industrial case study concerning the concurrent control of a real-time high-speed mechanism. The case study shows that the axiomatic verification method employed is complex: it requires the user to have a relatively comprehensive understanding of both the proof system and the application. Through a series of observations, the thesis notes that CSP has the scope to support a more procedural approach to verification in the form of testing. The thesis investigates this technique and proposes the method of Ideal Test Sets. By exploiting the underlying structure of the CSP semantic model, it is shown that for certain processes and specifications the obligation of verification can be reduced to testing the specification over a finite subset of the behaviours of the process.
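As a loose illustration of checking a specification over a finite subset of a process's behaviours (in the spirit of CSP trace specifications, not the thesis's Ideal Test Sets method), the Python sketch below tests every prefix of a set of observed traces against a safety predicate. The event names, observed traces and predicate are assumptions for the example.

```python
# Sketch of testing a trace specification over a finite set of behaviours,
# in the spirit of CSP trace specifications; traces and spec are assumptions.
def prefixes(trace):
    """All prefixes of a trace, since trace specs must hold prefix-closed."""
    return [tuple(trace[:i]) for i in range(len(trace) + 1)]

def satisfies(observed_traces, spec):
    """Check every observed trace, and each of its prefixes, against spec."""
    failures = [t for tr in observed_traces for t in prefixes(tr) if not spec(t)]
    return not failures, failures

# Safety spec: the mechanism never performs 'release' more often than 'grip'.
def spec(trace):
    return trace.count("release") <= trace.count("grip")

observed = [("grip", "move", "release"),
            ("grip", "grip", "release", "release")]
ok, bad = satisfies(observed, spec)
print(ok)   # True: all tested finite behaviours satisfy the specification
```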
Abstract:
Based on recent advances in autonomic computing, we propose a methodology for the cost-effective development of self-managing systems starting from a model of the resources to be managed and using a general-purpose autonomic architecture.
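As a hedged illustration of what a resource model driving a general-purpose autonomic architecture might look like, here is a minimal Python sketch in the spirit of the well-known MAPE-K (monitor, analyze, plan, execute) loop from autonomic computing. The resource model, thresholds and scaling actions are assumptions for the example, not the paper's methodology.

```python
# Sketch of an autonomic manager in the spirit of the MAPE-K loop, driven by
# a declarative model of the managed resource; the model, thresholds and
# actions below are illustrative assumptions, not the paper's methodology.
import random

RESOURCE_MODEL = {"web-server": {"metric": "cpu", "high": 0.8, "low": 0.2,
                                 "min_instances": 1, "max_instances": 4}}

def monitor(resource):
    return {"cpu": random.random()}           # stand-in for real sensors

def analyze(model, metrics):
    cpu = metrics[model["metric"]]
    if cpu > model["high"]:
        return "overloaded"
    if cpu < model["low"]:
        return "underused"
    return "healthy"

def plan(state, instances, model):
    if state == "overloaded" and instances < model["max_instances"]:
        return instances + 1                   # scale out
    if state == "underused" and instances > model["min_instances"]:
        return instances - 1                   # scale in
    return instances

instances = 1
for tick in range(5):                          # execute: apply each plan
    model = RESOURCE_MODEL["web-server"]
    state = analyze(model, monitor("web-server"))
    instances = plan(state, instances, model)
    print(tick, state, "instances:", instances)
```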