856 results for Web-Centric Expert System


Relevance:

100.00%

Publisher:

Abstract:

This thesis gathers the experience gained in developing an intelligent supervisory system to improve the management of wastewater treatment plants, implementing it in a real plant (EDAR Granollers) and evaluating its day-to-day operation against typical plant situations. The supervisory system combines and integrates classical control tools for treatment plants (automatic control of the dissolved oxygen level in the biological reactor, use of descriptive process models...) with tools from the field of artificial intelligence (knowledge-based systems, specifically expert systems and case-based systems, and neural networks). The document is structured in nine chapters. An introductory part reviews the current state of WWTP control and explains why the management of these processes is so complex (chapter 1). This introductory chapter, together with chapter 2, which presents the background of the thesis, serves to establish the objectives of the work (chapter 3). Chapter 4 then describes the particularities and specific features of the plant chosen for implementing the supervisory system. Subsequent chapters present the work carried out to develop the rule-based or expert system (chapter 6) and the case-based system (chapter 7). Chapter 8 describes the integration of these two reasoning tools into a distributed multi-level architecture. Finally, a last chapter covers the evaluation (verification and validation), first of each tool separately and then of the overall system against real situations arising at the treatment plant.
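The abstract stays at the architectural level; purely as an illustrative sketch (the component names, variables and thresholds below are invented, not taken from the thesis), a supervisory step that consults expert rules first and falls back to case-based retrieval might look like this in Python:

```python
# Purely illustrative sketch (not the thesis code): a supervisory step that
# tries explicit expert rules first and falls back to case-based retrieval.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PlantState:
    dissolved_oxygen: float  # mg/L in the biological reactor
    influent_flow: float     # m3/h
    ammonia: float           # mg/L

def rule_based_advice(state: PlantState) -> Optional[str]:
    # Toy "expert system" rules; the thresholds are invented.
    if state.dissolved_oxygen < 1.0:
        return "Raise the dissolved-oxygen setpoint of the aeration controller"
    if state.ammonia > 5.0:
        return "Check nitrification; consider increasing sludge age"
    return None  # no rule fires

# Toy case base of past situations and the actions that worked.
CASE_BASE = [
    (PlantState(0.8, 1300.0, 6.5), "Storm inflow: boost aeration and monitor effluent"),
    (PlantState(2.1, 800.0, 1.0), "Normal operation: keep current setpoints"),
]

def case_based_advice(state: PlantState) -> str:
    # Nearest-neighbour retrieval with a crude, unweighted distance.
    def distance(case: PlantState) -> float:
        return (abs(case.dissolved_oxygen - state.dissolved_oxygen)
                + abs(case.influent_flow - state.influent_flow) / 100.0
                + abs(case.ammonia - state.ammonia))
    _, action = min(CASE_BASE, key=lambda item: distance(item[0]))
    return action

def supervise(state: PlantState) -> str:
    return rule_based_advice(state) or case_based_advice(state)

print(supervise(PlantState(dissolved_oxygen=0.7, influent_flow=1250.0, ammonia=6.2)))
```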

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the design, implementation and testing of an intelligent knowledge-based supervisory control (IKBSC) system for a hot rolling mill process. A novel architecture is used to integrate an expert system with an existing supervisory control system and a new optimization methodology for scheduling the soaking pits in which the material is heated prior to rolling. The resulting IKBSC system was applied to an aluminium hot rolling mill process to improve the shape quality of low-gauge plate and to optimise the use of the soaking pits to reduce energy consumption. The results from the trials demonstrate the advantages to be gained from the IKBSC system that integrates knowledge contained within data, plant and human resources with existing model-based systems. (c) 2005 Elsevier Ltd. All rights reserved.
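The paper's optimisation methodology is not reproduced here; as a loose, hedged illustration only (the pit count, heating times and the greedy earliest-free-pit rule are assumptions, not the authors' method), a soaking-pit assignment could be prototyped as:

```python
# Illustrative greedy scheduler: assign each ingot to the soaking pit that
# becomes free earliest (a stand-in rule, not the paper's methodology).
import heapq

def schedule(ingots, n_pits):
    """ingots: list of (ingot_id, heating_hours); returns (ingot, pit, start, end)."""
    pits = [(0.0, pit) for pit in range(n_pits)]  # (time pit becomes free, pit id)
    heapq.heapify(pits)
    plan = []
    for ingot_id, hours in ingots:
        free_at, pit = heapq.heappop(pits)
        start, end = free_at, free_at + hours
        plan.append((ingot_id, pit, start, end))
        heapq.heappush(pits, (end, pit))
    return plan

for row in schedule([("A", 4), ("B", 6), ("C", 3), ("D", 5)], n_pits=2):
    print(row)
```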

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study is to develop a Pollution Early Warning System (PEWS) for efficient management of water quality in oyster harvesting areas. To that end, this paper presents a web-enabled, user-friendly PEWS for managing water quality in oyster harvesting areas along the Louisiana Gulf Coast, USA. The PEWS consists of (1) an Integrated Space-Ground Sensing System (ISGSS) gathering data on environmental factors influencing water quality, (2) an Artificial Neural Network (ANN) model for predicting the level of fecal coliform bacteria, and (3) a web-enabled, user-friendly Geographic Information System (GIS) platform for issuing water pollution advisories and managing oyster harvesting waters. The ISGSS (data acquisition system) collects near real-time environmental data from various sources, including the NASA MODIS Terra and Aqua satellites and in-situ sensing stations managed by the USGS and the NOAA. The ANN model is developed using the ANN program in the MATLAB toolbox. The ANN model involves a total of six independent environmental variables, including rainfall, tide, wind, salinity, temperature and weather type, along with eight different combinations of the independent variables. The ANN model is constructed and tested using environmental and bacteriological data collected monthly from 2001 to 2011 by the Louisiana Molluscan Shellfish Program at seven oyster harvesting areas on the Louisiana Coast, USA. The ANN model is capable of explaining about 76% of the variation in fecal coliform levels for model training data and 44% for independent data. The web-based GIS platform is developed using ArcView GIS and ArcIMS. The web-based GIS system can be employed for mapping fecal coliform levels, as predicted by the ANN model, and potential risks of norovirus outbreaks in oyster harvesting waters. The PEWS is able to inform decision-makers of potential risks of fecal pollution and virus outbreaks on a daily basis, greatly reducing the risk that contaminated oysters pose to human health.
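The study built its model with the ANN program in the MATLAB toolbox; the following is a hedged sketch of a comparable setup in Python, with scikit-learn standing in for that toolbox and random placeholder data instead of the Louisiana monitoring records:

```python
# Sketch of an ANN relating six environmental inputs to fecal coliform level.
# scikit-learn replaces the MATLAB toolbox used in the study; data are synthetic,
# so the printed score is meaningless except as a demonstration of the workflow.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Columns: rainfall, tide, wind, salinity, temperature, weather type (encoded)
X = rng.random((500, 6))
y = rng.random(500)  # stand-in for (log) fecal coliform counts

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```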

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a method of identifying morphological attributes that classify wear particles in relation to the wear process from which they originate and permit automatic identification without human expertise. The method is based on the use of a Multi-Layer Perceptron (MLP) for the analysis of specific types of microscopic wear particles. The classification of the wear particles was performed according to their morphological attributes, such as size and aspect ratio, among others. (C) 2010 Journal of Mechanical Engineering. All rights reserved.
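As an illustration of the kind of pipeline the abstract describes (the chosen attributes, network size and wear classes are assumptions, not the paper's exact configuration), morphological features could be measured and then classified as follows:

```python
# Illustrative pipeline: measure particle morphology from a labelled mask,
# then classify particles with an MLP (synthetic data, assumed features).
import numpy as np
from skimage import measure
from sklearn.neural_network import MLPClassifier

def particle_features(binary_mask):
    """Return [area, aspect ratio] for each connected particle in the mask."""
    feats = []
    for region in measure.regionprops(measure.label(binary_mask)):
        aspect = region.major_axis_length / max(region.minor_axis_length, 1e-6)
        feats.append([region.area, aspect])
    return feats

mask = np.zeros((20, 20), dtype=bool)
mask[5:8, 2:12] = True                      # one elongated synthetic particle
print(particle_features(mask))

# Placeholder training data: [area, aspect_ratio] rows, wear-mechanism labels.
X = np.array([[120, 1.1], [140, 1.2], [400, 3.5], [380, 3.0]])
y = np.array(["fatigue", "fatigue", "cutting", "cutting"])
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict([[390, 3.2]]))
```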

Relevance:

100.00%

Publisher:

Abstract:

An extended version of HIER, a query-the-user facility for expert systems, is presented. HIER was developed to run over Prolog programs and has been incorporated into systems that support the design of large and complex applications. The framework of the extended version is described, as well as the major features of the implementation. An example involving the design of a specific database application is included to illustrate the use of the tool.

Relevance:

100.00%

Publisher:

Abstract:

The University of São Paulo has been experiencing an increase in content in electronic and digital formats, distributed by different suppliers and hosted remotely or in clouds, and faces growing difficulties in facilitating access to this digital collection for its users while still coexisting with the traditional world of physical collections. A possible solution was identified in the new generation of systems called Web Scale Discovery, which allow better management, data integration and faster searching. To identify whether and how such a system would meet USP's demands and expectations and, if so, what the analysis criteria for such a tool should be, an analytical study with an essentially documental basis was structured, based on a review of the literature and on data available from official websites and from libraries already using this kind of resource. The conceptual basis of the study was defined after identifying software assessment methods already available, generating a standard with 40 analysis criteria, ranging from details of the single access interface to information content, web 2.0 characteristics, intuitive interface and faceted navigation, among others. The details of the studies conducted on four of the major systems currently available in this software category are presented, providing subsidies for the decision-making of other libraries interested in such systems.

Relevance:

100.00%

Publisher:

Abstract:

This article presents a multi-agent expert system (SMAF) that allows the input of incidents occurring in different elements of the telecommunications area. SMAF interacts with experts and general users, and each agent interacts with the whole agent community, recording incidents and their solutions in a knowledge base without analysing their causes. Incidents are expressed using keywords taken from natural language (originally Spanish), and their main concepts are recorded together with their severities as the users express them. The system then searches for the best solution for each incident, assisted by a human operator, using a notion of distance between incidents.
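The distance notion itself is only alluded to in the abstract; as a hedged sketch (Jaccard distance over keyword sets is an assumed stand-in, not necessarily SMAF's measure), retrieval of the closest recorded incident might look like:

```python
# Sketch: retrieve the most similar past incident by keyword overlap.
# Jaccard distance is an illustrative choice, not necessarily SMAF's measure.
def jaccard_distance(a, b):
    union = a | b
    if not union:
        return 0.0
    return 1.0 - len(a & b) / len(union)

# Toy knowledge base: (incident keywords, recorded solution).
knowledge_base = [
    ({"router", "link", "down"}, "Restart interface and verify routing table"),
    ({"latency", "voip", "jitter"}, "Reprioritise VoIP traffic (QoS policy)"),
]

def best_solution(incident_keywords):
    _, solution = min(knowledge_base,
                      key=lambda item: jaccard_distance(item[0], incident_keywords))
    return solution

print(best_solution({"link", "down", "switch"}))
```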

Relevance:

100.00%

Publisher:

Abstract:

Expert systems are built from knowledge traditionally elicited from the human expert. It is precisely knowledge elicitation from the expert that is the bottleneck in expert system construction. On the other hand, a data mining system, which automatically extracts knowledge, needs expert guidance on the successive decisions to be made in each of the system phases. In this context, expert knowledge and data mining discovered knowledge can cooperate, maximizing their individual capabilities: data mining discovered knowledge can be used as a complementary source of knowledge for the expert system, whereas expert knowledge can be used to guide the data mining process. This article summarizes different examples of systems where there is cooperation between expert knowledge and data mining discovered knowledge and reports our experience of such cooperation gathered from a medical diagnosis project called Intelligent Interpretation of Isokinetics Data, which we developed. From that experience, a series of lessons were learned throughout project development. Some of these lessons are generally applicable and others pertain exclusively to certain project types.

Relevance:

100.00%

Publisher:

Abstract:

This work deals with quality-level prediction in concrete structures through the assistance of an expert system which is able to apply reasoning to this field of structural engineering. Evidence, hypotheses and factors related to this field of human knowledge have been codified into a knowledge base in terms of probabilities for the presence of either hypotheses or evidence, and for the conditional presence of both. Human experts in structural engineering and the safety of structures provided the knowledge and assistance necessary for constructing this "computer knowledge body".
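The abstract states that hypotheses and evidence are encoded as probabilities, including conditional ones; a minimal worked illustration of how one such entry could be queried (the names and figures below are invented, not taken from the system) is:

```python
# Minimal Bayesian update over one toy "knowledge base" entry (invented figures).
# P(H | E) = P(E | H) * P(H) / P(E), with P(E) expanded over H and not-H.
p_h = 0.10          # prior: probability of a low-quality concrete element
p_e_given_h = 0.80  # evidence (e.g. visible cracking) given the hypothesis
p_e_given_not_h = 0.05

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e
print(f"P(hypothesis | evidence) = {p_h_given_e:.2f}")  # ~0.64 with these figures
```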

Relevance:

100.00%

Publisher:

Abstract:

This final year project is set within the framework of the web system of the subject "Procesadores de Lenguajes", which belongs to the Lenguajes y Sistemas Informáticos e Ingeniería de Software department of the Escuela Técnica Superior de Ingenieros Informáticos of the Universidad Politécnica de Madrid. The work comprises several lines of development within that framework, all arising from the need to upgrade the system so that it is accessible to every kind of user while keeping it up to date with the latest technologies. First, the project studies the accessibility of the Procesadores de Lenguajes web site against the second version of the Web Content Accessibility Guidelines (WCAG 2.0). A detailed report was produced with the results of this study on the WCAG acceptance criteria, and the changes needed to correct the failing criteria were then implemented, so that the site is accessible to people with different types of disability. In the same spirit of a more accessible site, the system was also brought up to date technologically: at the start of the work it consisted of a set of static pages (XHTML 1.1 + CSS 2.1) and a set of dynamic pages (XHTML 1.1 + CSS 2.1 + PHP + MySQL), which have been updated to their most recent versions (HTML 5 and CSS 3). The site also includes a system for creating lab groups that simplifies their management for both teachers and students, as well as student registration for the subject, and an administration module through which the teaching staff can manage the system. A battery of tests was run on the deployed system to guarantee its correct operation, all errors detected during that process were corrected, and new functionalities that have arisen since the system was created were implemented. Finally, an RSS alert system was developed that lets students keep up with the notices and news published on the site's noticeboard. This RSS alert system will also serve other web sites of the School that use the multipurpose noticeboard, and it can be viewed in both English and Spanish.
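The RSS alert system is described only in outline; as a hedged sketch (the channel details, URLs and helper names below are assumptions, not the project's code), a feed for the noticeboard could be generated with Python's standard library:

```python
# Illustrative RSS 2.0 feed for a noticeboard (structure only; names/URLs assumed).
import xml.etree.ElementTree as ET

def build_feed(notices):
    """notices: list of dicts with 'title', 'link', 'description'."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Procesadores de Lenguajes - Notices"
    ET.SubElement(channel, "link").text = "https://example.org/tablon"  # placeholder URL
    ET.SubElement(channel, "description").text = "Course noticeboard alerts"
    for n in notices:
        item = ET.SubElement(channel, "item")
        for tag in ("title", "link", "description"):
            ET.SubElement(item, tag).text = n[tag]
    return ET.tostring(rss, encoding="unicode")

print(build_feed([{"title": "Lab groups open", "link": "https://example.org/1",
                   "description": "Group registration is now open."}]))
```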

Relevance:

100.00%

Publisher:

Abstract:

Since the passage of the Federal-Aid Highway Act of 1956, the automobile has become the primary form of transportation on the Mississippi Gulf Coast. As the rate of motor vehicle use continues to rise faster than population growth, the benefits of the current transportation system are coming at a price that rivals annual household expenditures for housing. Furthermore, the automobile-centric transportation system incurs environmental costs. Carbon dioxide emissions, motor fuel use, health care costs for chronic illness, and the loss and impairment of natural resources due to sprawling development, continue to escalate. This project analyzes the environmental costs associated with automobile-centric planning for the urbanized area of the Mississippi Gulf Coast and compares these costs to those of alternative transportation modes.

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes the relationship between the techniques used to build expert systems and the behaviors they exhibit to show that there is not sufficient evidence to link the behavioral shortcomings of first-generation expert systems to the shallow methods of representation and inference they employ. There is only evidence that the shortcomings are a consequence of a general lack of knowledge. Moreover, the paper shows that the first generation of expert systems employs both shallow methods and most of the so-called deep methods. Lastly, we show that deeper methods augment but do not replace shallow reasoning methods; most expert systems should possess both.

Relevance:

100.00%

Publisher:

Abstract:

Owing to the high degree of vulnerability of liquid retaining structures to corrosion problems, there are stringent requirements in their design against cracking. In this paper, a prototype knowledge-based system is developed and implemented for the design of liquid retaining structures based on the blackboard architecture. A commercially available expert system shell, VISUAL RULE STUDIO, working as an ActiveX Designer under the VISUAL BASIC programming environment, is employed. A hybrid knowledge representation approach with production rules and procedural methods under object-oriented programming is used to represent the engineering heuristics and design knowledge of this domain. It is demonstrated that the blackboard architecture is capable of integrating different kinds of knowledge in an effective manner. The system is tailored to give advice to users regarding preliminary design, loading specification and optimized configuration selection for this type of structure. An example application is given to illustrate the capabilities of the prototype system in transferring knowledge on liquid retaining structures to novice engineers. (C) 2004 Elsevier Ltd. All rights reserved.
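The system itself was built with the VISUAL RULE STUDIO shell under VISUAL BASIC; purely to illustrate the blackboard idea (the toy rules below are not the paper's knowledge base), a minimal blackboard loop in Python could look like:

```python
# Minimal blackboard sketch: knowledge sources read and write a shared dict
# until no source can add anything more (toy rules, for illustration only).
def specify_loading(bb):
    if "capacity_m3" in bb and "water_load_kN" not in bb:
        bb["water_load_kN"] = bb["capacity_m3"] * 9.81  # weight of stored water

def preliminary_sizing(bb):
    if "water_load_kN" in bb and "wall_thickness_mm" not in bb:
        # crude heuristic thickness rule, invented for this sketch
        bb["wall_thickness_mm"] = max(200.0, 0.05 * bb["water_load_kN"])

KNOWLEDGE_SOURCES = [specify_loading, preliminary_sizing]

blackboard = {"capacity_m3": 500.0}
changed = True
while changed:
    before = dict(blackboard)
    for ks in KNOWLEDGE_SOURCES:
        ks(blackboard)          # each source contributes when its inputs exist
    changed = blackboard != before

print(blackboard)
```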

Relevance:

100.00%

Publisher:

Abstract:

MULTIPRED is a web-based computational system for the prediction of peptide binding to multiple molecules (proteins) belonging to human leukocyte antigens (HLA) class I A2, A3 and class II DR supertypes. It uses hidden Markov models and artificial neural network methods as predictive engines. A novel data representation method enables MULTIPRED to predict peptides that promiscuously bind multiple HLA alleles within one HLA supertype. Extensive testing was performed for validation of the prediction models. Testing results show that MULTIPRED is both sensitive and specific and has good predictive ability (area under the receiver operating characteristic curve, AROC > 0.80). MULTIPRED can be used for the mapping of promiscuous T-cell epitopes as well as the regions of high concentration of these targets, termed T-cell epitope hotspots. MULTIPRED is available at http://antigen.i2r.a-star.edu.sg/multipred/.

Relevance:

100.00%

Publisher:

Abstract:

The Environmental Sciences Division within the Queensland Environmental Protection Agency works to monitor, assess and model the condition of the environment. The Division has a legislative responsibility to produce, every four years, a whole-of-government report dealing with environmental conditions and trends, the "State of the Environment" report (SoE) [1][2][3]. The State of Environment Web Service Reporting System is a supplementary, web-service-based SoE reporting tool, which aims to deliver accurate, timely and accessible information on the condition of the environment through web services over the Internet [4][5]. This prototype provides a scientific assessment of environmental conditions for a set of environmental indicators. It contains text descriptions as well as tables, charts and maps with spatiotemporal dimensions to show the impact of certain environmental indicators on our environment. The prototype is a template-based indicator system, to which the administrator may add new SQL queries for new indicator services without changing the architecture or code of the template. The benefits come from a service-oriented architecture that provides an online query service with seamless integration. In addition, because it uses a web service architecture, each individual component within the application can be implemented in a different programming language and on a different operating system. Although the services shown in this demo are built upon two datasets, the regional ecosystems and protected areas of Queensland, it will also be possible to report on the condition of water, air, land, coastal zones, energy resources, biodiversity, human settlements and natural and cultural heritage on the fly. Figure 1 shows the architecture of the prototype. In the next section, I will discuss the research tasks in the prototype.
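As a hedged illustration of the template-based indicator idea (Flask and SQLite are stand-ins for the prototype's actual stack, and the query registry below is invented), a new indicator can be exposed by registering one SQL statement rather than writing new code:

```python
# Sketch: template-based indicator service where adding an indicator means
# adding one SQL entry, not new application code (Flask/SQLite are stand-ins,
# not the prototype's actual technology stack; table names are examples).
import sqlite3
from flask import Flask, jsonify, abort

INDICATOR_QUERIES = {  # administrator-maintained registry (example entries)
    "protected_area_km2": "SELECT region, SUM(area_km2) FROM protected_areas GROUP BY region",
    "ecosystem_count": "SELECT region, COUNT(*) FROM regional_ecosystems GROUP BY region",
}

app = Flask(__name__)

@app.route("/indicator/<name>")
def indicator(name):
    sql = INDICATOR_QUERIES.get(name)
    if sql is None:
        abort(404)
    with sqlite3.connect("soe.db") as conn:   # placeholder database file
        rows = conn.execute(sql).fetchall()
    return jsonify({"indicator": name, "rows": rows})

if __name__ == "__main__":
    app.run(debug=True)
```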