898 results for 280201 Expert Systems
Abstract:
This paper presents a Genetic Algorithm (GA) approach to resolving traffic conflicts at a railway junction. The formulation of the problem for the suitable application of a GA is discussed, and three neighborhoods are proposed for generation evolution. The performance of the GA is evaluated by computer simulation. This study paves the way for further applications of artificial intelligence techniques in a rather conservative industry.
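As an illustration of the kind of GA formulation this abstract describes, a right-of-way sequence at a junction can be encoded as a permutation of trains and evolved against a delay cost. The cost function, operators, and parameters below are illustrative assumptions for the sketch, not the paper's actual model:

```python
import random

random.seed(0)  # reproducible toy run

def total_delay(sequence, clear_times):
    # Toy cost: each train waits for all its predecessors to clear the junction.
    elapsed, cost = 0, 0
    for train in sequence:
        elapsed += clear_times[train]
        cost += elapsed
    return cost

def swap_mutate(seq):
    # Exchange the right-of-way positions of two randomly chosen trains.
    i, j = random.sample(range(len(seq)), 2)
    seq = list(seq)
    seq[i], seq[j] = seq[j], seq[i]
    return seq

def ga(clear_times, pop_size=20, generations=50):
    n = len(clear_times)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda s: total_delay(s, clear_times))
        survivors = pop[: pop_size // 2]  # truncation selection keeps the best half
        pop = survivors + [swap_mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda s: total_delay(s, clear_times))

best = ga([5, 1, 3, 2])  # junction-clearing time per train, in minutes
```

Because the best half of each generation survives unchanged, the best sequence found never degrades across generations.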
Abstract:
This study investigates the application of local search methods to the railway junction traffic conflict-resolution problem, with the objective of attaining a quick and reasonable solution. A procedure based on local search relies on finding a better solution than the current one by searching its neighbourhood. The neighbourhood structure is therefore crucial to an efficient local search procedure. In this paper, the structure of the solution, which is the right-of-way sequence assignment, is first described. Two new neighbourhood definitions are then proposed, and the performance of the corresponding local search procedures is evaluated by simulation. The two procedures are shown to provide similar results, but they can be used to handle different traffic conditions and system requirements.
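The local-search skeleton the abstract refers to can be sketched as follows. The swap neighbourhood and the toy delay cost here are illustrative assumptions; the paper's own neighbourhood definitions differ:

```python
def swap_neighbours(seq):
    # Neighbourhood: all sequences reachable by exchanging two trains' positions.
    for i in range(len(seq)):
        for j in range(i + 1, len(seq)):
            neighbour = list(seq)
            neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
            yield neighbour

def local_search(seq, cost):
    # Steepest descent: move to the best neighbour until none improves.
    while True:
        best = min(swap_neighbours(seq), key=cost)
        if cost(best) >= cost(seq):
            return seq
        seq = best

clear = [5, 1, 3, 2]  # junction-clearing time per train

def delay(seq):
    # Same toy cost as above: sum of cumulative junction-clearing times.
    elapsed = total = 0
    for train in seq:
        elapsed += clear[train]
        total += elapsed
    return total

result = local_search([0, 1, 2, 3], delay)
```

For this toy cost, the shortest-clearing-first order is the unique local (and global) optimum under the swap neighbourhood, so steepest descent always reaches it.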
Abstract:
This paper proposes a train movement model with fixed runtime that can be employed to find feasible control strategies for a single train along an inter-city railway line. The objective of the model is to minimize arrival delays at each station along the line. However, train movement is a typically nonlinear problem owing to complex running environments and varying requirements. A heuristic algorithm is developed to solve the problem, and the simulation results show that the train can overcome disturbances caused by delays and coordinate its operation strategies to ensure punctual arrival at the destination. The developed algorithm can also be used to evaluate the running reliability of trains under scheduled timetables.
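One simple heuristic consistent with the abstract's goal, offered purely as an illustrative sketch rather than the paper's algorithm, is to absorb an accumulated delay greedily by running each remaining segment closer to its minimum runtime:

```python
def recover_delay(scheduled, minimum, delay):
    # Greedy recovery: spend each segment's slack (scheduled minus minimum
    # runtime) on the outstanding delay until it is absorbed.
    plan = []
    for sched, min_rt in zip(scheduled, minimum):
        slack = sched - min_rt
        recovered = min(slack, delay)
        plan.append(sched - recovered)
        delay -= recovered
    return plan, delay  # residual delay carried to the destination

# Scheduled vs minimum segment runtimes (seconds) and a 45 s departure delay.
plan, residual = recover_delay([300, 240, 360], [280, 230, 330], 45)
```

A nonzero residual indicates the delay exceeds the total recoverable slack, which is exactly the reliability information the abstract says the algorithm can extract from a scheduled timetable.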
Abstract:
Business process models are becoming available in large numbers owing to their popular use in many industrial applications, such as enterprise and quality engineering projects. On the one hand, this raises a management challenge: how can it be ensured that the proper process model is always available to the interested stakeholder? On the other hand, the richness of a large set of process models also offers opportunities, for example with respect to the re-use of existing model parts for new models. This paper describes the functionalities and architecture of an advanced process model repository named APROMORE. The tool brings together a rich set of features for the analysis, management and usage of large sets of process models, drawing from state-of-the-art research in the field of process modeling. A prototype of the platform is presented, demonstrating its feasibility, together with an outlook on the further development of APROMORE.
Abstract:
Technology-mediated collaboration processes have been studied extensively for over a decade. Most applications with collaboration concepts reported in the literature focus on enhancing the efficiency and effectiveness of decision-making processes in objective and well-structured workflows. However, relatively few studies have investigated the application of collaboration schemes to problems of a subjective and unstructured nature. In this paper, we explore a new intelligent collaboration scheme for fashion design which, by nature, relies heavily on human judgment and creativity. Techniques such as multicriteria decision making, fuzzy logic, and artificial neural network (ANN) models are employed, and industrial data sets are used for the analysis. Our experimental results suggest that the proposed scheme offers a significant improvement over the traditional method in terms of time-cost effectiveness, and a company interview with design professionals has confirmed its effectiveness and significance.
Abstract:
In open railway markets, coordinating train schedules at an interchange station requires negotiation between two independent train operating companies to resolve their operational conflicts. This paper models the stakeholders as software agents and proposes an agent negotiation model to study their interaction. Three negotiation strategies have been devised to represent the possible objectives of the stakeholders, and they determine each agent's behavior in proposing offers to its counterpart. Empirical simulation results confirm that the use of the proposed negotiation strategies leads to outcomes that are consistent with the objectives of the stakeholders.
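A common way to encode such negotiation strategies is a time-dependent concession tactic; the sketch below is a generic alternating-offers illustration with assumed parameters, not the paper's actual model:

```python
def offer(initial, reserve, beta, t, deadline):
    # Time-dependent concession tactic: beta < 1 concedes late (Boulware),
    # beta > 1 concedes early (Conceder); beta == 1 is linear.
    frac = (t / deadline) ** (1.0 / beta)
    return initial + frac * (reserve - initial)

def negotiate(agent_a, agent_b, deadline):
    # Offers over a single issue (e.g. a connection-time slot, in minutes);
    # agreement is reached once the two proposals cross.
    for t in range(deadline + 1):
        offer_a = offer(*agent_a, t=t, deadline=deadline)
        offer_b = offer(*agent_b, t=t, deadline=deadline)
        if offer_a >= offer_b:
            return (offer_a + offer_b) / 2, t
    return None, deadline  # reserves never overlap: negotiation fails

# Each agent: (initial offer, reservation value, concession exponent).
deal, round_reached = negotiate((10.0, 60.0, 1.0), (100.0, 40.0, 1.0), 10)
```

Varying the concession exponents changes which agent captures more of the overlap between the reservation values, which is how different stakeholder objectives can be expressed as strategies.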
Abstract:
This paper presents a fault diagnosis method based on an adaptive neuro-fuzzy inference system (ANFIS) in combination with decision trees. Classification and regression tree (CART), one of the decision tree methods, is used as a feature-selection procedure to select pertinent features from the data set. The crisp rules obtained from the decision tree are then converted into fuzzy if-then rules, which are employed to identify the structure of the ANFIS classifier. A hybrid of back-propagation and least-squares algorithms is used to tune the parameters of the membership functions. To evaluate the proposed algorithm, data sets obtained from vibration and current signals of induction motors are used. The results indicate that the CART–ANFIS model has potential for fault diagnosis of induction motors.
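The crisp-to-fuzzy rule conversion the abstract mentions can be illustrated as follows; the thresholds, slope, and choice of a sigmoid membership are assumptions for the sketch, not values taken from the paper:

```python
import math

def membership(x, threshold, slope):
    # Fuzzified version of the crisp test "x > threshold": a sigmoid that
    # rises smoothly from 0 to 1 around the threshold.
    return 1.0 / (1.0 + math.exp(-slope * (x - threshold)))

def fault_firing_strength(vibration, current):
    # Hypothetical crisp CART rule: IF vibration > 0.7 AND current > 3.0 THEN fault.
    # Fuzzy counterpart: the rule's firing strength is the product
    # (a common fuzzy AND) of the two memberships.
    return membership(vibration, 0.7, 10.0) * membership(current, 3.0, 10.0)
```

Far from the thresholds the fuzzy rule agrees with the crisp one; near them it degrades gracefully instead of switching abruptly, and the slopes are exactly the membership parameters ANFIS would tune.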
Abstract:
Railway timetabling is an important process in train service provision, as it matches transportation demand with infrastructure capacity while also considering customer satisfaction. It is a multi-objective optimisation problem in which a feasible solution, rather than the optimal one, is usually adopted in practice because of time constraints. The quality of service may suffer as a result. In a railway open market, timetabling usually involves rounds of negotiation among a number of self-interested and independent stakeholders, and hence additional objectives and constraints are imposed on the timetabling problem. When the requirements of all stakeholders are taken into consideration simultaneously, the computational demand is inevitably immense. Intelligent solution-searching techniques offer a possible way forward. This paper employs a particle swarm optimisation (PSO) approach to devise a railway timetable in an open market. The suitability and performance of PSO are studied on a multi-agent-based railway open-market negotiation simulation platform.
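A minimal PSO of the kind named in the abstract can be sketched as below; the toy timetabling objective and all parameters are assumptions for illustration, not the paper's formulation:

```python
import random

random.seed(1)  # reproducible toy run

def pso(cost, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Minimal particle swarm optimiser over a continuous search space.
    pos = [[random.uniform(-30.0, 30.0) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]      # personal best positions
    gbest = min(pbest, key=cost)[:]  # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

# Toy timetabling objective: penalise squared deviation of each operator's
# departure-time offset (minutes) from its requested slot.
requested = [5.0, -10.0, 12.0]
best = pso(lambda x: sum((xi - ri) ** 2 for xi, ri in zip(x, requested)), dim=3)
```

In a negotiation setting the objective would instead aggregate the conflicting stakeholder penalties, but the swarm update itself is unchanged.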
Abstract:
The dynamic capabilities view (DCV) focuses on the renewal of firms’ strategic knowledge resources so as to sustain competitive advantage in turbulent markets. Within the context of the DCV, the focus of knowledge management (KM) is to develop the KM capability (KMC) by deploying knowledge governance mechanisms that facilitate knowledge processes so as to produce superior business performance over time. The essence of KM performance evaluation is to assess how well the KMC is configured with knowledge governance mechanisms and processes that enable a firm to achieve superior performance by matching its knowledge base with market needs. However, little research has been undertaken to evaluate KM performance from the DCV perspective. This study employed a survey design and hypothesis-testing approaches to develop a capability-based KM evaluation framework (CKMEF) that upholds the basic assertions of the DCV. Under the governance of the framework, a KM index (KMI) and a KM maturity model (KMMM) were derived not only to indicate the extent to which a firm’s KM implementations fulfill its strategic objectives and to identify the evolutionary phase of its KMC, but also to benchmark the KMC against the research population. The research design ensured that the evaluation framework and instruments have statistical significance and good generalizability across the research population, namely construction firms operating in the dynamic Hong Kong construction market. The study demonstrated the feasibility of quantitatively evaluating the development of the KMC and revealing the performance heterogeneity associated with that development.
Abstract:
In condition-based maintenance (CBM), effective diagnostic and prognostic tools are essential for maintenance engineers to identify imminent faults and predict the remaining useful life before components finally fail. This enables remedial actions to be taken in advance and production to be rescheduled if necessary. All machine components are subject to degradation processes in real environments, and they have certain failure characteristics that can be related to the operating conditions. This paper describes a technique for accurate assessment of the remnant life of bearings based on health-state probability estimation and on historical knowledge embedded in a closed-loop diagnostics and prognostics system. The technique uses a Support Vector Machine (SVM) classifier to estimate the health-state probability of the machine degradation process and thereby provide long-term prediction. To validate the feasibility of the proposed model, real-life fault history data from bearings of High Pressure Liquefied Natural Gas (HP-LNG) pumps were analysed and used to obtain an optimal prediction of remaining useful life (RUL). The results were very encouraging and showed that the proposed prognosis system based on health-state probability estimation has the potential to be used as a tool for remnant life prediction in industrial machinery.
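One natural way to turn health-state probabilities into an RUL estimate is a probability-weighted average over discrete states; the four states, their nominal lives, and the example probabilities below are hypothetical, and the paper's own mapping may differ:

```python
def expected_rul(state_probs, state_ruls):
    # RUL estimate: probability-weighted average of the nominal remaining
    # life associated with each discrete health state.
    assert abs(sum(state_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * r for p, r in zip(state_probs, state_ruls))

# Hypothetical classifier output over four degradation states
# (healthy -> near failure) and nominal hours of life left in each state.
probs = [0.1, 0.2, 0.5, 0.2]
hours_left = [5000.0, 2000.0, 800.0, 100.0]
rul_hours = expected_rul(probs, hours_left)
```

As the classifier shifts probability mass toward the later degradation states over time, the estimate decreases smoothly rather than jumping at hard state boundaries.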
Abstract:
Knowledge-based development has become a new urban policy approach for competitive cities in the era of the global knowledge economy. For cities seeking knowledge-based development, benchmarking is an essential prerequisite for the informed and strategic vision and policy making needed to achieve prosperous development. Nevertheless, benchmarked analysis of the knowledge-based development performance of global and emerging knowledge cities is an understudied area. This paper aims to contribute to the field by introducing the methodology of a novel performance assessment model, the Knowledge-Based Urban Development Assessment Model, and by providing lessons from its application in an international knowledge city performance analysis study. The assessment model puts renowned global and emerging knowledge cities, namely Birmingham, Boston, Brisbane, Helsinki, Istanbul, Manchester, Melbourne, San Francisco, Sydney, Toronto, and Vancouver, under the knowledge-based development microscope. The results of the analysis provide an internationally benchmarked snapshot of the degree of achievement in various knowledge-based urban development performance areas of the investigated cities, and reveal insightful lessons for scrutinizing global perspectives on the knowledge-based development of cities.
Abstract:
In the 21st century, it has become apparent that ‘knowledge’ is a major factor of postmodern production (Yigitcanlar et al., 2007). Beyond this, in today’s rapidly globalizing world, knowledge, along with its social and technological settings, is seen as a key to securing economic prosperity and quality of life (Yigitcanlar et al., 2008a). However, limiting the benefits of ‘knowledge-based development’ to economic gains alone, and to a degree to social ones, is quite a narrow view (Yigitcanlar et al., 2008b). Thus, the concept of ‘knowledge-based urban development’ was coined to bring economic prosperity, environmental sustainability, a just socio-spatial order and good governance to cities, thereby producing a purposefully designed city, i.e., a ‘knowledge city’, that generates positive environmental and governance outcomes as well as economic and societal ones (Yigitcanlar, 2011; Carrillo et al., 2014).
Abstract:
In this paper we introduce a formalization of Logical Imaging applied to IR in terms of Quantum Theory, through the use of an analogy between the states of a quantum system and the terms in text documents. Our formalization relies upon the Schrödinger Picture, creating an analogy between the dynamics of a physical system and the kinematics of the probabilities generated by Logical Imaging. By using Quantum Theory, it is possible to model contextual information more precisely, in a seamless and principled fashion, within the Logical Imaging process. While further work is needed to validate this empirically, the foundations for doing so are provided.
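As a hedged sketch of the probability kinematics the abstract alludes to (the notation is assumed here, not quoted from the paper): imaging on a proposition $A$ moves the probability of every world (term) $t$ onto $t_A$, the world most similar to $t$ among those satisfying $A$, so that

```latex
P_A(t') \;=\; \sum_{t} P(t)\,\mathbb{1}[\,t_A = t'\,],
\qquad
P_A(B) \;=\; \sum_{t} P(t)\,\mathbb{1}[\,t_A \in B\,].
```

In the IR analogy, worlds correspond to index terms, and the similarity that selects $t_A$ is where the contextual information modelled via Quantum Theory enters.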