944 results for Expert system


Relevance: 60.00%

Abstract:

This research aimed to develop a fuzzy-inference-based expert system to help prevent lameness in dairy cattle. Hoof length, nutritional parameters and floor material properties (roughness) were used to build the fuzzy inference system. The expert system architecture was defined using the Unified Modelling Language (UML). Data were collected in a commercial dairy herd using two subgroups (H-1 and H-2) in order to validate the fuzzy inference functions. The numbers of true-positive (TP), false-positive (FP), true-negative (TN) and false-negative (FN) responses, obtained by comparison against an established gold standard, were used to build the classifier system. A Lesion Incidence Possibility (LIP) function was developed to indicate the chance of a cow becoming lame. The observed lameness percentages in H-1 and H-2 were 8.40% and 1.77%, respectively, while the system estimated a LIP of 5.00% and 2.00%. The simulation therefore differed from the real lameness data by 3.40% for H-1 and by 0.23% for H-2, indicating the system's usefulness in decision-making.
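As a rough illustration of how a fuzzy inference system can turn hoof length, floor roughness and a nutrition score into a lesion-incidence estimate, the Python sketch below uses triangular memberships and Mamdani-style weighted rules; all membership breakpoints, rule weights and variable names are illustrative assumptions, not the values used in the study.

```python
# Minimal sketch of a fuzzy LIP (Lesion Incidence Possibility) estimate.
# Membership breakpoints and rule weights are illustrative assumptions,
# not those of the paper.

def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def lip(hoof_len_mm, floor_roughness, diet_score):
    # Degree to which each input is considered "risky" (assumed ranges).
    long_hoof = tri(hoof_len_mm, 75, 95, 115)
    smooth_floor = tri(floor_roughness, 0.0, 0.2, 0.5)
    poor_diet = tri(diet_score, 0.0, 0.3, 0.6)

    # Mamdani-style rules: each rule's firing strength scales a risk level (%).
    rules = [
        (min(long_hoof, smooth_floor), 80.0),              # long hooves on smooth floor -> high risk
        (min(long_hoof, poor_diet), 60.0),                 # long hooves with poor diet  -> medium-high
        (max(long_hoof, smooth_floor, poor_diet), 20.0),   # any single factor           -> low risk
    ]
    num = sum(w * level for w, level in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(f"LIP ≈ {lip(100, 0.15, 0.4):.1f}%")
```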

Relevance: 60.00%

Abstract:

This work describes the creation of heuristic rules based on 13C-NMR spectroscopy that characterize several skeletal types of diterpenes. Using a collection of 2745 spectra, we built a database linked to the expert system SISTEMAT. Several programs were applied to the database to discover characteristic signals that identify, with good performance, a large diversity of skeletal types. The heuristic approach first differentiates groups of skeletons by the numbers of primary, secondary, tertiary and quaternary carbons; within each group, the program then searches for ranges of chemical shifts that identify a specific skeletal type. The program was tested against 100 recently published structures and identified the correct skeleton in 65 of the studied cases. When a skeleton comprises several hundred compounds, such as the labdanes, the program employs the concept of subskeletons, so that labdanes with double bonds at different positions are not classified in the same group. The chemical-shift ranges for each subskeleton and the structures of all skeletal types are given. The consultation program can be obtained from the authors. © 1997 - IOS Press. All rights reserved.
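A minimal sketch of the two-stage heuristic described above (first group by the counts of primary/secondary/tertiary/quaternary carbons, then test characteristic chemical-shift ranges) might look as follows; the rule data are invented placeholders, not SISTEMAT's actual rules.

```python
# Sketch of the two-stage heuristic: group candidates by carbon-type counts,
# then check characteristic 13C chemical-shift ranges. The rules shown are
# invented placeholders, not SISTEMAT's actual rules.

RULES = {
    "labdane-like (example)": {
        "carbon_counts": {"CH3": 5, "CH2": 7, "CH": 3, "C": 5},            # assumed
        "shift_ranges_ppm": [(14.0, 18.0), (38.0, 42.0), (55.0, 60.0)],    # assumed
    },
}

def matches(rule, carbon_counts, shifts_ppm):
    if carbon_counts != rule["carbon_counts"]:
        return False
    # Every characteristic range must contain at least one observed signal.
    return all(any(lo <= s <= hi for s in shifts_ppm)
               for lo, hi in rule["shift_ranges_ppm"])

def classify(carbon_counts, shifts_ppm):
    return [name for name, rule in RULES.items()
            if matches(rule, carbon_counts, shifts_ppm)]

print(classify({"CH3": 5, "CH2": 7, "CH": 3, "C": 5},
               [15.2, 39.8, 57.1, 120.4]))
```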

Relevance: 60.00%

Abstract:

OBJECTIVE: This study proposes a new approach that considers uncertainty in predicting and quantifying the presence and severity of diabetic peripheral neuropathy. METHODS: A rule-based fuzzy expert system was designed by four experts in diabetic neuropathy. The model variables were used to classify neuropathy in diabetic patients as mild, moderate, or severe. System performance was evaluated with the Kappa agreement measure, comparing the results of the model with those generated by the experts in an assessment of 50 patients. Accuracy was evaluated by ROC curve analysis based on 50 other cases, whose clinical assessments were considered the gold standard. RESULTS: According to the Kappa analysis, the model was in moderate agreement with expert opinions. The ROC analysis (evaluation of accuracy) yielded an area under the curve of 0.91, demonstrating very good consistency in classifying patients with diabetic neuropathy. CONCLUSION: The model efficiently classified diabetic patients with different degrees of neuropathy severity. In addition, it provides a way to quantify diabetic neuropathy severity and allows a more accurate assessment of the patient's condition.
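For readers unfamiliar with the agreement statistic used, the short sketch below computes Cohen's kappa between hypothetical model and expert severity labels; the labels are made up and the implementation is a generic one, not the study's evaluation code.

```python
# Minimal sketch of the agreement check described: Cohen's kappa between model
# and expert labels. The labels below are invented.

from collections import Counter

def cohens_kappa(a, b):
    n = len(a)
    labels = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n           # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[l] * cb[l] for l in labels) / (n * n)    # chance agreement
    return (po - pe) / (1 - pe)

model  = ["mild", "moderate", "severe", "mild", "moderate", "mild"]
expert = ["mild", "moderate", "moderate", "mild", "severe", "mild"]
print(f"kappa = {cohens_kappa(model, expert):.2f}")
```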

Relevance: 60.00%

Abstract:

The supply of mineral resources and the protection of the environment are often regarded as opposing and irreconcilable activities, but in reality they are two indispensable needs of modern societies. Since they are non-renewable, georesources must be exploited efficiently, using tools that guarantee the environmental, social and economic sustainability of extractive operations. The need to protect the territory and improve the quality of life of local communities requires the public administration to implement measures for the requalification of degraded areas, but until the early 1990s the sector legislation provided no instruments for this purpose, which led to the proliferation of disused and abandoned quarry sites with no environmental restoration. This research work provides innovative contributions to the sustainable planning and design of extractive activities, through a multidisciplinary approach to the subject and the expert use of Geographic Information Systems, in particular GRASS GIS. Following an in-depth analysis of the tools and procedures adopted for the planning of extractive activities in Italy, an investigation method and an expert system were developed for predicting and controlling the ground vibrations induced by blasting in open-pit quarries; they make it possible to optimise the design of the blast and of the vibration-monitoring system thanks to specific operational tools implemented in GRASS GIS. To support more effective planning of territorial requalification measures, a procedure was devised for selecting disused sites and potential requalification interventions, which optimises planning activities by identifying interventions with high environmental, economic and social sustainability. The results obtained demonstrate the need for an expert approach to the planning and design of extractive activities, increasing their sustainability through the adoption of more efficient operational tools.
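Blast-induced ground vibrations of the kind the thesis predicts and controls are commonly estimated with an empirical scaled-distance law; the sketch below shows that generic law with placeholder site constants and a hypothetical limit, and is not necessarily the model implemented in GRASS GIS by the author.

```python
# Sketch of the empirical scaled-distance law commonly used to predict the
# peak particle velocity (PPV) of blast-induced ground vibrations:
#     PPV = K * (R / sqrt(Q)) ** (-beta)
# where R is the distance from the blast [m] and Q the charge per delay [kg].
# K and beta are site constants fitted from monitoring data; the values below
# are placeholders, not those of the thesis.

import math

def ppv_mm_s(distance_m: float, charge_kg: float,
             K: float = 1140.0, beta: float = 1.6) -> float:
    scaled_distance = distance_m / math.sqrt(charge_kg)
    return K * scaled_distance ** (-beta)

# Example: check a blast design against a hypothetical 5 mm/s limit.
for q in (20, 40, 80):  # kg per delay
    v = ppv_mm_s(distance_m=150, charge_kg=q)
    print(f"charge {q:3d} kg -> PPV ≈ {v:5.2f} mm/s",
          "(OK)" if v <= 5.0 else "(exceeds limit)")
```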

Relevance: 60.00%

Abstract:

The thesis centred on the game "Guess Who?" ("Indovina chi?"), in which the Nao robot must identify a character from its description; the description is obtained through questions and answers. The aim of the thesis is the design of a system able to understand and process data communicated in a subset of natural language, extract the key information and match it against previously given information. The Nao robot was therefore programmed to play a game of "Guess Who?" against a human, communicating in natural language. Extraction and categorisation rules for text understanding were implemented using Cogito, a technology patented by the company Expert System. In this way the robot is able to understand the answers and reply to the questions posed by the human in natural language. Google's speech-recognition API was used for voice recognition and PyAudio for microphone access. The program was implemented in Python, and the character data are stored in a database that the robot queries and updates. The game algorithm is based on probabilistic estimates of the robot's chances of winning and on choosing which question to ask given the answers previously received from the human. The semantic rules allow the player to phrase sentences in natural language, and the robot can single out the information concerning the character to be guessed without being misled. Over 20 games, the robot's winning rate was 50%. The database was designed so that a complete identikit of a person can be built, beyond the game characters; the project can therefore be extended, beyond the game, to other identification purposes.
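A toy version of the question-selection step, greedily asking about the attribute that splits the remaining candidates most evenly, could look like the sketch below; the character attributes are invented, and the thesis additionally weighs the robot's winning probability and stores the data in a database.

```python
# Sketch of a "Guess Who?"-style question selector: ask about the attribute
# that splits the remaining candidates most evenly. The character data are
# invented placeholders.

candidates = [
    {"name": "Anna",  "glasses": True,  "hat": False, "beard": False},
    {"name": "Bruno", "glasses": False, "hat": True,  "beard": True},
    {"name": "Carla", "glasses": True,  "hat": True,  "beard": False},
    {"name": "Dario", "glasses": False, "hat": False, "beard": True},
]

def best_question(people, attributes):
    # The best attribute is the one whose yes/no split is closest to 50/50.
    def imbalance(attr):
        yes = sum(p[attr] for p in people)
        return abs(2 * yes - len(people))
    return min(attributes, key=imbalance)

def filter_people(people, attr, answer: bool):
    return [p for p in people if p[attr] == answer]

attrs = ["glasses", "hat", "beard"]
q = best_question(candidates, attrs)
print("Ask about:", q)
print("If the answer is 'yes':",
      [p["name"] for p in filter_people(candidates, q, True)])
```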

Relevance: 60.00%

Abstract:

BACKGROUND AND OBJECTIVE: Sleep disturbances are prevalent but often overlooked or underestimated. We suspected that sleep disorders might be particularly common among pharmacy customers, and that they could benefit from counselling. Therefore, we described the prevalence and severity of symptoms associated with sleep and wakefulness disorders among Swiss pharmacy customers, and estimated the need for counselling and treatment. METHODS: In 804 Swiss pharmacies (49% of all community pharmacies), clients were invited to complete the Stanford Sleep Disorders Questionnaire (SDQ) and the Epworth Sleepiness Scale (EPW). The SDQ was designed to classify symptoms of sleep and wakefulness into the four most prevalent disorders: sleep apnoea syndrome (SAS), insomnia in psychiatric disorders (PSY), periodic leg movement disorder/restless legs (RLS) and narcolepsy (NAR). Data were entered into an internet-linked database for analysis by an expert system as a basis for immediate counselling by the pharmacist. RESULTS: Of 4901 participants, 3238 (66.1%) were female and 1663 (33.9%) were male. The mean age (SD) of females and males was 52.4 (18.05) and 55.1 (17.10) years, respectively. The percentages of female and male individuals above the cut-off of the SDQ subscales were 11.4% and 19.8% for sleep apnoea, 40.9% and 38.7% for psychiatric sleep disorders, 59.3% and 46.8% for restless legs, and 10.4% and 9.4% for narcolepsy, respectively. The prevalence of an Epworth Sleepiness Scale score >11 was 16.5% in females and 23.9% in males. Reliability, assessed by Cronbach's alpha, ranged from 0.65 to 0.78 for the SDQ subscales and the Epworth score. CONCLUSIONS: Symptoms of sleep and wakefulness disorders among Swiss pharmacy customers were highly prevalent. The SDQ and the Epworth Sleepiness Scale score had satisfactory reliability to be useful for identifying pharmacy customers who might benefit from information and counselling while visiting pharmacies. The internet-based system proved to be a helpful tool for pharmacists when counselling their customers in terms of diagnostic classification and severity of symptoms associated with the sleeping and waking state.
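The reliability figure quoted (Cronbach's alpha) can be reproduced on any item-response matrix with a few lines of code; the sketch below uses an invented toy matrix purely to show the formula, and is not the study's analysis code.

```python
# Sketch of Cronbach's alpha computed on a toy item-response matrix:
# rows = respondents, columns = questionnaire items. The data are invented,
# purely to show the formula
#     alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)   # sample variance

def cronbach_alpha(rows):
    k = len(rows[0])                       # number of items
    items = list(zip(*rows))               # column-wise view
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_var / total_var)

responses = [
    [3, 2, 3, 4],
    [1, 1, 2, 1],
    [4, 3, 4, 4],
    [2, 2, 3, 2],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```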

Relevance: 60.00%

Abstract:

International trade in horses is important and continuously increasing, so the risk of spread of infectious diseases is permanently present. Within this context, the worldwide situation of equine vector-borne diseases and of other diseases notifiable to the World Organisation for Animal Health (OIE) is described. Furthermore, estimates of the numbers of horse movements between these countries are provided, as well as information on import requirements and preventive measures for reducing the risk of disease spread. According to TRACES (Trade Control and Expert System of the European Union) data from 2009 and 2010, 81 horses per week were imported into Europe from North America, 42 per week from South America, 11 per week from North Africa and the African horse sickness free zone of South Africa, 28 per week from the Middle East and the rest of Asia, and approximately 4 per week from Australia/Oceania. Trade within the European Union resulted, among other things, in the introduction of Equine Infectious Anaemia (EIA) from Romania into other European countries. Another example is the suspected case of glanders which occurred after the importation of horses from Lebanon via France and Germany into Switzerland in July 2011.

Relevance: 60.00%

Abstract:

Academic and industrial research in the late 90s brought about an exponential explosion of DNA sequence data. Automated expert systems are being created to help biologists extract patterns, trends and links from this ever-deepening ocean of information. Two such systems, aimed at retrieving and subsequently utilizing phylogenetically relevant information, were developed in this dissertation, the major objective of which was to automate the often difficult and confusing phylogenetic reconstruction process. Popular phylogenetic reconstruction methods, such as distance-based methods, attempt to find an optimal tree topology (one that reflects the relationships among related sequences and their evolutionary history) by searching through the topology space. Various compromises between fast (but incomplete) and exhaustive (but computationally prohibitive) search heuristics have been suggested. An intelligent compromise algorithm that relies on a flexible "beam" search principle from the Artificial Intelligence domain and uses pre-computed local topology reliability information to adjust the beam search space continuously is described in the second chapter of this dissertation. However, sometimes even a (virtually) complete distance-based method is inferior to the significantly more elaborate (and computationally expensive) maximum likelihood (ML) method. In fact, depending on the nature of the sequence data in question, either method might prove superior. Therefore, it is difficult (even for an expert) to tell a priori which phylogenetic reconstruction method (distance-based, ML, or maybe maximum parsimony, MP) should be chosen for any particular data set. A number of factors, often hidden, influence the performance of a method. For example, it is generally understood that for a phylogenetically "difficult" data set, more sophisticated methods (e.g., ML) tend to be more effective and thus should be chosen. However, it is the interplay of many factors that one needs to consider in order to avoid choosing an inferior method (potentially a costly mistake, both in terms of computational expense and reconstruction accuracy). Chapter III of this dissertation details a phylogenetic reconstruction expert system that automatically selects an appropriate method. It uses a classifier (a Decision Tree-inducing algorithm) to map a new data set to the proper phylogenetic reconstruction method.
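The Chapter III idea, mapping descriptors of a data set to a reconstruction method with a decision-tree classifier, can be sketched as follows; the features, labels and training points are invented for illustration and do not come from the dissertation.

```python
# Sketch of a decision-tree classifier that maps simple descriptors of a
# sequence data set to a reconstruction method. The features, training labels
# and thresholds are invented; the dissertation derives them from benchmarks.

from sklearn.tree import DecisionTreeClassifier

# Hypothetical features: [n_taxa, alignment_length, mean_pairwise_divergence]
X = [
    [10,   500, 0.05],
    [60,  1200, 0.30],
    [25,   800, 0.15],
    [120, 2000, 0.45],
    [15,   600, 0.08],
    [80,  1500, 0.35],
]
# Hypothetical "best method" labels: NJ = distance-based, ML = maximum likelihood
y = ["NJ", "ML", "NJ", "ML", "NJ", "ML"]

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(clf.predict([[40, 1000, 0.25]]))   # e.g. ['ML'] for a "difficult" data set
```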

Relevance: 60.00%

Abstract:

This work is part of the CAMEVA project for the development of an expert system aimed at the automatic identification of ores [1, 2]. It relies on measuring their reflectance values, R, on digital images. Software for the calibration, acquisition and analysis of the multispectral data was designed by AITEMIN [3]; the research was also assessed by H.J. Bernhardt and E. Pirard [1].

Relevance: 60.00%

Abstract:

Traditional identification of ore minerals with reflected-light microscopy relies heavily on the experience of the observer. Qualified observers have become a rarity, as ore microscopy is often neglected in today's university training, yet since it furnishes necessary and inexpensive information, innovative alternatives are needed, especially for quantification. Many of the diagnostic optical properties of ores defy quantification, but recent developments in electronics and optics allow new insights into the reflectance and colour properties of ores. Preliminary results are presented for the development of an expert system aimed at the automatic identification of ores based on their reflectance properties. The discriminatory capacity of the system is enhanced by near-IR reflectance measurements, while the UV filters tested to date have proved unreliable. Interaction with image analysis software through a fully automated microscope, to furnish quantitative and morphological information for geometallurgy, relies on automated identification of the ores based on the measured spectra. This methodology enormously increases the productivity of the microscopist; nevertheless, supervision by an expert is always needed.
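A minimal sketch of identification by reflectance spectra, matching a measured spectrum against reference spectra by nearest distance, is shown below; the reference values are rough placeholders rather than the system's calibrated data.

```python
# Sketch of identification by reflectance spectra: compare a measured spectrum
# (reflectance at a few visible / near-IR wavelengths) with reference spectra
# and pick the closest one. Reference values are rough placeholders, not the
# calibrated data used by the system.

REFERENCES = {                      # reflectance (%) at 470, 546, 589, 650, 800 nm
    "pyrite":       [37.0, 48.0, 52.0, 54.0, 56.0],
    "chalcopyrite": [32.0, 42.0, 45.0, 46.0, 47.0],
    "galena":       [46.0, 43.0, 43.0, 42.0, 41.0],
}

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def identify(measured):
    return min(REFERENCES, key=lambda name: distance(measured, REFERENCES[name]))

print(identify([45.0, 43.5, 42.8, 42.0, 41.2]))   # -> 'galena'
```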

Relevance: 60.00%

Abstract:

This paper describes a novel method to enhance current airport surveillance systems used in Advanced Surface Movement Guidance and Control Systems (A-SMGCS). The proposed method allows for the automatic calibration of measurement models and enhanced detection of non-ideal situations, increasing the integrity of surveillance products. It is based on the definition of a set of observables from the surveillance processing chain and a rule-based expert system aimed at changing the data-processing methods accordingly.

Relevance: 60.00%

Abstract:

In this paper, we present the use of D-higraphs to perform HAZOP studies. D-higraphs is a formalism that includes in a single model both the functional and the structural (ontological) components of any given system. A tool to perform a semi-automatic, guided HAZOP study on a process plant is presented. The diagnostic system uses an expert system to predict the behavior modeled with D-higraphs. The approach is applied to an industrial case study and its results are compared with other, similar approaches proposed in previous studies. The analysis shows that the proposed methodology fits its purpose, enabling causal reasoning that explains the causes and consequences derived from deviations; it also fills some of the gaps and addresses drawbacks found in previously reported HAZOP assistant tools.
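The causal reasoning a HAZOP assistant encodes can be pictured as simple rules linking a guide-word deviation to plausible causes and consequences, as in the sketch below; the rules are generic textbook examples, not the D-higraphs model of the paper.

```python
# Sketch of rule-based HAZOP reasoning: for a guide-word deviation on a process
# variable, look up plausible causes and consequences. The rules below are
# generic textbook examples, not the paper's D-higraphs model.

RULES = {
    ("flow", "NO"): {
        "causes": ["blocked line", "pump failure", "closed valve"],
        "consequences": ["loss of cooling", "downstream starvation"],
    },
    ("pressure", "MORE"): {
        "causes": ["blocked outlet", "external fire", "control-valve failure"],
        "consequences": ["possible vessel overpressure", "relief-valve lift"],
    },
}

def hazop(variable, guide_word):
    rule = RULES.get((variable, guide_word.upper()))
    if rule is None:
        return f"No rule for deviation '{guide_word} {variable}'."
    return (f"Deviation: {guide_word} {variable}\n"
            f"  Possible causes:       {', '.join(rule['causes'])}\n"
            f"  Possible consequences: {', '.join(rule['consequences'])}")

print(hazop("pressure", "more"))
```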

Relevance: 60.00%

Abstract:

Diabetes mellitus is a disorder of carbohydrate metabolism caused by insufficient or absent insulin production, or by reduced sensitivity to this hormone. It is a chronic disease with a higher prevalence in developed countries, mainly due to obesity, sedentary lifestyles and endocrine dysfunctions related to the pancreas. Type 1 diabetes is an autoimmune disease in which the insulin-producing beta cells of the pancreas are destroyed, so administration of exogenous insulin becomes necessary. A Type 1 diabetic patient must follow a therapy of subcutaneously administered insulin adapted to their metabolic needs and lifestyle; this therapy attempts to mimic the insulin profile of a non-pathological pancreas. Current technology makes it possible to develop the so-called artificial endocrine pancreas, which would bring precision, efficacy and safety to patients in terms of normalising glycaemic control and reducing the risk of hypoglycaemia, while allowing the patient to be less preoccupied with the disease. The artificial pancreas consists of a continuous glucose sensor, an insulin infusion pump and a control algorithm that computes the insulin to be infused using glucose as its main input. This work presents a semi-closed-loop control method based on a rule-based fuzzy expert system. Fuzzy regulation builds on the ambiguity of human language: this uncertainty is used to form a set of rules that represent human reasoning while at the same time controlling a process, in this case the glucoregulatory system. The project focuses on the design of a fuzzy controller that, using variables such as glucose, insulin and diet, is able to restore the endocrine function of the pancreas by technological means. The algorithm was validated mainly through simulation experiments with a population of synthetic patients, evaluating the results with first-order statistics and more specific indices such as the Kovatchev risk index, and comparing them with those obtained by previous control methods. The results show that the fuzzy controller (FBPC) improves glycaemic control with respect to a predictive expert system based on Boolean rules (pBRES). The FBPC always reduces the maximum glucose and raises the minimum glucose relative to the pBRES, but it is in mis-adjusted therapies that the FBPC is especially robust: it lowers the maximum glucose by 8.64 mg/dl, uses 3.92 IU less insulin, raises the minimum glucose by 3.32 mg/dl and keeps 15.33 more samples within the 80–110 mg/dl glucose range. It can therefore be concluded that the FBPC achieves better glycaemic control than the pBRES controller, making it especially effective, robust and safe under conditions of basal-therapy mismatch, and with considerable scope for future improvement.
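The Kovatchev risk index mentioned above can be computed from a glucose trace as sketched below, following the published symmetrising transformation; the glucose values are invented and the code is a generic illustration, not the thesis's evaluation scripts.

```python
# Sketch of the Kovatchev blood-glucose risk indices (LBGI / HBGI), using the
# published symmetrisation f(BG) = 1.509*((ln BG)^1.084 - 5.381) for BG in mg/dl.
# The glucose trace below is invented.

import math

def risk_indices(bg_mg_dl):
    rl, rh = [], []
    for bg in bg_mg_dl:
        f = 1.509 * (math.log(bg) ** 1.084 - 5.381)
        r = 10.0 * f * f
        rl.append(r if f < 0 else 0.0)   # risk attributable to hypoglycaemia
        rh.append(r if f > 0 else 0.0)   # risk attributable to hyperglycaemia
    n = len(bg_mg_dl)
    return sum(rl) / n, sum(rh) / n      # (LBGI, HBGI)

trace = [65, 80, 95, 110, 140, 180, 220, 160, 120, 90]
lbgi, hbgi = risk_indices(trace)
print(f"LBGI = {lbgi:.2f}, HBGI = {hbgi:.2f}")
```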

Relevance: 60.00%

Abstract:

Transport capacity is one of the fundamental yardsticks for assessing the progress that an economic and social area can achieve, and it is a sector of great importance for today's society. Among the different modes of transport, rail is one of those currently growing fastest: for both passengers and freight, the train has become a very useful means of transport within cities, between nearby cities and, increasingly, thanks to high speed, between cities separated by long distances. This Thesis aims to assist the design of one of the most important stages of a railway installation project: the electric traction power supply system. The design phase of a railway traction power system faces many questions that must be answered precisely, since the ability to meet the energy demand of railway operation depends on the success of this phase; direct and indirect installation and operating costs must also be considered. The Methodology presented in this Thesis offers the designer an expert system whose solutions are a set of correct power-supply scenarios, verified by solving equation models and correct both in terms of the validity of the electrical parameters and in terms of budgeted costs and the impact of indirect costs. By using this Methodology, the designer obtains in a relatively short time a set of feasible solutions from which to choose the one that best fits his or her final interests.
This Thesis was developed within a research line of the Railway Research Centre CITEF-UPM. Among other projects, CITEF has carried out validation and dimensioning studies of railway electrical systems for a wide range of clients and railway systems. Throughout these projects, interest has mostly centred on the following parameters of the electrical system:
- Calculating the number and position of traction substations, and the power of each substation.
- The type of catenary (overhead contact line) along the route, the conductors that make it up and their characteristics.
- Calculating the number and position of autotransformers for systems operating in dual-voltage AC (2x25 kV).
- The position of neutral zones.
- Validation against the regulations of: voltage drops along the line, maximum return-circuit voltages, conductor overheating, and overheating of the traction-substation transformers.
The idea is that the solutions provided by the Methodology suggest scenarios in which these parameters stay within the limits set by the regulations. Having a repository of possible scenarios whose parameters and electrical elements have already been verified as correct saves time and testing, noticeably improving the usual design process for railway electrical systems. Direct costs related to elements such as traction substations, autotransformers and neutral zones take up a large share of a railway system's budget. This Thesis also examines the indirect costs incurred in the installation and operation of electrical systems: those derived from environmental impact, the costs of maintaining the electrical equipment and the catenary, the costs of connecting the traction substations to the transmission or distribution grid and, finally, the installation costs of each element; based on experience, these were judged relevant enough to be kept under some control. The Methodology covers the possibility that the proposed electrical designs involve unacceptable cost variations or, for equal electrical parameters, proposes the cheapest designs with respect to these costs. In analysing direct and indirect costs, their impact is divided between those incurred during installation and those incurred later, during the operation of the railway line. These costs are normally in opposition (the better one is, the worse the other tends to be), so a system that treats both objectives separately is needed.
To achieve these objectives, the Methodology is built on the following basic pillars:
- The railway simulator Hamlet, which integrates modules for building complete railway track layouts, for the mechanical and traction simulation of rolling stock, for railway signalling and for the electrical system, implemented in C++ and Matlab.
- An analysis of how to focus the different possible electrical scenarios so that they can be examined quickly, based on the maximum power-demand peak produced by the railway traffic.
- Optimisation algorithms: after studying which algorithms could be adapted to such a complex system, genetic algorithms were chosen. Three genetic algorithms were selected, allowing information to be gathered on the behaviour and results of each; chosen for their response times, multi-objective capability, ease of adaptation and wide application in engineering projects, they are NSGA-II, AMGA-II and ɛ-MOEA.
- The design of objective functions and of a model prepared to work with direct and indirect costs and with the basic constraints that the electrical scenarios must not violate; these constraints watch over the electrical behaviour and the budgetary stability.
The tests carried out with the system have either reproduced situations that can occur in practice or used real systems and problems. Besides making it possible to validate the Methodology, this allowed the genetic algorithms to be compared with each other and the selected electrical systems to be compared with real ones, leading to very satisfactory conclusions. The Methodology suggests a very interesting line of work, both for the results already obtained and for the opportunities its evolution may create; this Thesis has been developed with that idea in mind, so it is expected to serve as another contribution to the validation and design of railway power supply systems.
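The core multi-objective notion behind the chosen genetic algorithms, keeping only scenarios that are not Pareto-dominated on the two opposed cost objectives (installation vs operation), can be sketched as follows; the candidate scenarios and cost figures are invented.

```python
# Sketch of the multi-objective idea behind NSGA-II / AMGA-II / e-MOEA:
# candidate power-supply scenarios are compared by Pareto dominance on two
# opposed objectives (installation cost, operation cost) and only the
# non-dominated ones are kept. The candidate costs are invented placeholders.

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    return [c for c in candidates
            if not any(dominates(other["costs"], c["costs"])
                       for other in candidates if other is not c)]

scenarios = [
    {"name": "A (many small substations)", "costs": (120.0, 35.0)},  # (install, operate)
    {"name": "B (few large substations)",  "costs": (90.0, 55.0)},
    {"name": "C",                          "costs": (130.0, 60.0)},  # dominated by A
    {"name": "D",                          "costs": (100.0, 40.0)},
]

for s in pareto_front(scenarios):
    print(s["name"], s["costs"])
```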

Relevance: 60.00%

Abstract:

The products and services designed for Smart Cities provide the necessary tools to manage modern cities more efficiently. These tools need to gather citizens' information about their activity, preferences, habits, etc., opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and to comply with the laws of the country where they are provided. This paper presents one possible solution to manage this heterogeneity, bearing in mind that these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate collaboration between the business, legal and technological areas. This will ease the implementation of adequate, service-specific security and privacy policies. All these security and privacy policies are based on the information provided by the deployed platforms and by expert-system processing.