843 results for pacs: expert systems and other ai software and techniques
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe in the event of an evacuation resulting from fire or another incident? In the wake of major maritime disasters such as the Scandinavian Star, Herald of Free Enterprise, and Estonia, and in light of the growth in the numbers of high-density high-speed ferries and large-capacity cruise ships, issues concerning the evacuation of passengers and crew at sea are receiving renewed interest. Fire and evacuation models are now available with features such as the ability to realistically simulate the spread of fire, fire suppression systems, and the human response to fire, as well as the capability to model human performance in heeled orientations, linked to a virtual reality environment that produces realistic visualisations of modelled scenarios; these models can be used to aid the engineer in assessing ship designs and procedures. This paper describes the maritimeEXODUS ship evacuation model and the SMARTFIRE fire simulation model, and provides an example application demonstrating their use in performing fire and evacuation analysis for a large passenger ship, based in part on the requirements of MSC Circular 1033. The fire simulations include the action of a water mist system.
Abstract:
The needs for various forms of information systems relating to the European environment and ecosystem are reviewed, and limitations indicated. Existing information systems are reviewed and compared in terms of aims and functionalities. We consider two technical challenges involved in attempting to develop an IEEICS. First, there is the challenge of developing an Internet-based communication system which allows fluent access to information stored in a range of distributed databases. Some of the currently available solutions are considered, i.e. Web service federations. The second main challenge arises from the fact that there is general intra-national heterogeneity in the definitions adopted and the measurement systems used throughout the nations of Europe. Integrated strategies are needed.
Abstract:
This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of primary users (land-use planners, conservationists, and environmental researchers) we ask what use may be made of the EEICS for building models and tools that are of use in constructing decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches is reviewed. These include visualization and GIS, statistical tabulation and database SQL, and MDA and OLAP methods. The major problem of non-comparability of the definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear, and that a heuristic hybrid modelling approach should be used so as to ensure that the benefits of practical empirical modelling approaches are combined with scientifically well-founded, holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.
Abstract:
Dealing with uncertainty problems in intelligent systems has attracted a lot of attention in the AI community. Quite a few techniques have been proposed. Among them, the Dempster-Shafer theory of evidence (DS theory) has been widely appreciated. In DS theory, Dempster's combination rule plays a major role. However, it has been pointed out that the application domains of the rule are rather limited and the application of the theory sometimes gives unexpected results. We have previously explored the problem with Dempster's combination rule and proposed an alternative combination mechanism in generalized incidence calculus. In this paper we give a comprehensive comparison between generalized incidence calculus and the Dempster-Shafer theory of evidence. We first prove that these two theories have the same ability in representing evidence and combining DS-independent evidence. We then show that the new approach can deal with some dependent situations while Dempster's combination rule cannot. Various examples in the paper show the ways of using generalized incidence calculus in expert systems.
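For reference, the combination rule discussed in this abstract can be stated in its standard textbook form; the sketch below shows Dempster's rule itself, not the paper's generalized incidence calculus.

```latex
% Standard Dempster's rule of combination for two independent basic
% probability assignments m_1, m_2 over a frame of discernment \Theta:
\[
(m_1 \oplus m_2)(A) \;=\; \frac{1}{1-K}\sum_{B \cap C = A} m_1(B)\,m_2(C),
\qquad A \subseteq \Theta,\ A \neq \emptyset,
\]
\[
\text{where}\quad K \;=\; \sum_{B \cap C = \emptyset} m_1(B)\,m_2(C).
\]
% The rule is undefined when K = 1 (total conflict); highly conflicting
% evidence (K close to 1) is one well-known source of the unexpected
% results mentioned above.
```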
Abstract:
Changes to software requirements occur during initial development and subsequent to delivery, posing a risk to cost and quality while at the same time providing an opportunity to add value. Provision of a generic change source taxonomy will support requirements change risk visibility, and also facilitate richer recording of both pre- and post-delivery change data. In this paper we present a collaborative study to investigate and classify sources of requirements change, drawing comparison between those pertaining to software development and maintenance. We begin by combining evolution, maintenance and software lifecycle research to derive a definition of software maintenance, which provides the foundation for empirical context and comparison. Previously published change ‘causes’ pertaining to development are elicited from the literature, consolidated using expert knowledge and classified using card sorting. A second study incorporating causes of requirements change during software maintenance results in a taxonomy which accounts for the entire evolutionary progress of applications software. We conclude that the distinction between the terms maintenance and development is imprecise, and that changes to requirements in both scenarios arise due to a combination of factors contributing to requirements uncertainty and events that trigger change. The change trigger taxonomy constructs were initially validated using a small set of requirements change data, and deemed sufficient and practical as a means to collect common requirements change statistics across multiple projects.
Abstract:
Prostatic intraepithelial neoplasia (PIN) diagnosis and grading are affected by uncertainties which arise from the fact that almost all knowledge of PIN histopathology is expressed in concepts, descriptive linguistic terms, and words. A Bayesian belief network (BBN) was therefore used to reduce the problem of uncertainty in diagnostic clue assessment, while still considering the dependencies between elements in the reasoning sequence. A shallow network was used with an open-tree topology, with eight first-level descendant nodes for the diagnostic clues (evidence nodes), each independently linked by a conditional probability matrix to a root node containing the diagnostic alternatives (decision node). One of the evidence nodes was based on the tissue architecture and the others were based on cell features. The system was designed to be interactive, in that the histopathologist entered evidence into the network in the form of likelihood ratios for outcomes at each evidence node. The efficiency of the network was tested on a series of 110 prostate specimens, subdivided as follows: 22 cases of non-neoplastic prostate or benign prostatic tissue (NP), 22 PINs of low grade (PINlow), 22 PINs of high grade (PINhigh), 22 prostatic adenocarcinomas with cribriform pattern (PACcri), and 22 prostatic adenocarcinomas with large acinar pattern (PAClgac). The results obtained in the benign and malignant categories showed that the belief for the diagnostic alternatives is very high, the values being in general more than 0.8 and often close to 1.0. When considering the PIN lesions, the network classified and graded most of the cases with high certainty. However, there were some cases which showed values less than 0.8 (13 cases out of 44), thus indicating that there are situations in which the feature changes are intermediate between contiguous categories or grades. Discrepancy between morphological grading and the BBN results was observed in four out of 44 PIN cases: one PINlow was classified as PINhigh and three PINhigh were classified as PINlow. In conclusion, the network can grade PIN lesions and differentiate them from other prostate lesions with certainty. In particular, it offers a descriptive classifier which is readily implemented and which allows the use of linguistic, fuzzy variables.
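As an illustration of the kind of belief updating an open-tree network like this performs (evidence nodes conditionally independent given the root diagnosis), here is a minimal Python sketch. The class labels follow the abstract, but the prior, the likelihood values, and the function name are hypothetical and greatly simplified relative to the paper's conditional probability matrices.

```python
import numpy as np

# Hypothetical sketch of belief updating in a shallow, open-tree BBN:
# diagnostic alternatives at the root, evidence nodes conditionally
# independent given the diagnosis. All numbers are illustrative.

DIAGNOSES = ["NP", "PINlow", "PINhigh", "PACcri", "PAClgac"]

def update_belief(prior, likelihoods):
    """prior: P(diagnosis); likelihoods: one array per evidence node,
    giving P(observed finding | diagnosis) up to a constant factor."""
    posterior = np.array(prior, dtype=float)
    for lik in likelihoods:
        posterior *= np.asarray(lik, dtype=float)
    return posterior / posterior.sum()  # normalise to a probability

# Example: uniform prior and two made-up evidence nodes.
prior = np.full(len(DIAGNOSES), 1.0 / len(DIAGNOSES))
evidence = [
    np.array([0.1, 0.3, 0.9, 0.7, 0.6]),  # e.g. an architectural clue
    np.array([0.2, 0.5, 0.8, 0.9, 0.4]),  # e.g. a nuclear feature
]
belief = update_belief(prior, evidence)
print(dict(zip(DIAGNOSES, belief.round(3))))
```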
Abstract:
This paper describes middleware-level support for agent mobility, targeted at hierarchically structured wireless sensor and actuator network applications. Agent mobility enables a dynamic deployment and adaptation of the application on top of the wireless network at runtime, while allowing the middleware to optimize the placement of agents, e.g., to reduce wireless network traffic, transparently to the application programmer. The paper presents the design of the mechanisms and protocols employed to instantiate agents on nodes and to move agents between nodes. It also gives an evaluation of a middleware prototype running on Imote2 nodes that communicate over ZigBee. The results show that our implementation is reasonably efficient and fast enough to support the envisioned functionality on top of a commodity multi-hop wireless technology. Our work is to a large extent platform-neutral, thus it can inform the design of other systems that adopt a hierarchical structuring of mobile components. © 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering.
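The mechanics of moving an agent between nodes can be pictured as capture state, transfer, re-instantiate. The sketch below is a toy, single-process Python illustration of that idea with hypothetical class names; it does not reproduce the paper's middleware protocol, placement optimization, or its ZigBee transport.

```python
import pickle

# Toy "weak mobility" sketch: serialise an agent's state on one node and
# re-instantiate it on another. Class and method names are hypothetical.

class Agent:
    def __init__(self, name, state=None):
        self.name = name
        self.state = state or {}

    def run_step(self):
        self.state["steps"] = self.state.get("steps", 0) + 1

class Node:
    def __init__(self, node_id):
        self.node_id = node_id
        self.agents = {}

    def instantiate(self, payload):
        agent = pickle.loads(payload)      # rebuild the agent from its state
        self.agents[agent.name] = agent
        return agent

    def move_agent(self, name, dest):
        agent = self.agents.pop(name)      # stop it locally
        payload = pickle.dumps(agent)      # serialise its state
        return dest.instantiate(payload)   # restart it on the destination

a, b = Node("cluster-head"), Node("leaf-7")
a.agents["sampler"] = Agent("sampler")
a.agents["sampler"].run_step()
moved = a.move_agent("sampler", b)
print(moved.state)  # the state survives the move: {'steps': 1}
```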
Abstract:
At its core, Duverger’s Law—holding that the number of viable parties in first-past-the-post systems should not exceed two—applies primarily at the district level. While the number of parties nationally may exceed two, district-level party system fragmentation should not. Given that a growing body of research shows that district-level party system fragmentation can indeed exceed two in first-past-the-post systems, I explore whether the major alternative explanation for party system fragmentation—the social cleavage approach—can explain such violations of Duverger’s Law. Testing this argument in several West European elections prior to the adoption of proportional representation, I find evidence favouring a social cleavage explanation: with the expansion of the class cleavage, the average district-level party system eventually came to violate the two-party predictions associated with Duverger’s Law. This suggests that sufficient social cleavage diversity may produce multiparty systems in other first-past-the-post systems.
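The abstract does not state its fragmentation measure, but district-level party system fragmentation in this literature is commonly summarised by the Laakso-Taagepera effective number of parties, shown below for context only.

```latex
% Effective number of (electoral) parties in a district with vote
% shares v_1, ..., v_n; values above 2 correspond to the violations of
% the two-party expectation discussed above:
\[
N_{\mathrm{eff}} \;=\; \frac{1}{\sum_{i=1}^{n} v_i^{2}}
\]
```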
Abstract:
This thesis aims to contribute to the study and analysis of the factors related to digital radiographic image acquisition techniques, diagnostic quality, and radiation dose management in digital radiology systems. The methodology is organised into two components. The observational component is based on a retrospective, cross-sectional study design. Data collected from CR and DR systems allowed the evaluation of the technical exposure parameters used in digital radiology, of the absorbed dose, and of the detector exposure index. Within this methodological classification (retrospective and cross-sectional), it was also possible to carry out studies of diagnostic quality in digital systems: observer studies based on images archived in the PACS system. The experimental component of the thesis was based on phantom experiments to evaluate the relationship between dose and image quality. These experiments made it possible to characterise the physical properties of digital radiology systems by manipulating the variables related to the exposure parameters and assessing their influence on dose and image quality. Using a contrast-detail phantom, anthropomorphic phantoms, and an animal bone phantom, it was possible to obtain objective measures quantifying diagnostic quality and measures of object detectability. Several conclusions can be highlighted from this investigation. Quantitative measures of detector performance are the basis of the optimisation process, allowing the measurement and determination of the physical parameters of digital radiology systems. The exposure parameters used in clinical practice show that practice is not in conformity with the European reference framework. There is a need to evaluate, improve, and implement a reference standard for the optimisation process, through new good-practice guidelines adjusted to digital systems. Exposure parameters influence patient dose, but the perceived quality of the digital image does not appear to be affected by variations in exposure. The studies carried out with both phantom and patient images show that overexposure is a potential risk in digital radiology. The evaluation of the diagnostic quality of the images showed no substantial degradation of image quality when dose reduction was applied. The study and implementation of new diagnostic reference levels adjusted to digital radiology systems is proposed. As a contribution of the thesis, a model (STDI) is proposed for the optimisation of digital radiology systems.
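As one example of the objective image-quality measures that such phantom studies typically report, the contrast-to-noise ratio can be written as below; this is a generic definition given for context, not a formula taken from the thesis.

```latex
% Generic contrast-to-noise ratio for a detail region (mean signal S_d)
% against its background (mean signal S_b, noise standard deviation \sigma_b):
\[
\mathrm{CNR} \;=\; \frac{\lvert S_d - S_b \rvert}{\sigma_b}
\]
```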
Abstract:
The main object of this thesis is the study of algorithms for the automatic processing and representation of data, in particular information obtained from sensors mounted on board vehicles (2D and 3D), with application in the context of driving assistance systems. The work focuses on some of the problems that both automated driving (AD) systems and advanced driver assistance systems (ADAS) face today. The document is composed of two parts. The first describes the design, construction, and development of three robotic prototypes, including details of the sensors mounted on board the robots, the algorithms, and the software architectures. These robots were used as platforms to test and validate the proposed techniques. In addition, they took part in several autonomous driving competitions with very good results. The second part of the document presents several algorithms used to generate intermediate representations of sensor data. These can be used to improve existing pattern recognition, detection, or navigation techniques, and thereby contribute to future applications in the scope of AD or ADAS. Since autonomous vehicles carry a large number of sensors of different natures, intermediate representations are particularly suitable, as they can handle problems related to the diverse nature of the data (2D, 3D, photometric, etc.), to the asynchronous character of the data (multiple sensors sending data at different rates), or to data alignment (calibration problems, different sensors providing different measurements of the same object). In this scope, new techniques are proposed for computing a multi-camera, multi-modal inverse perspective mapping representation, for performing colour correction between images so as to obtain high-quality mosaics, and for generating a scene representation based on polygonal primitives, capable of handling large amounts of 3D and 2D data and of refining the representation as new sensor data are received.
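For context, a single-camera inverse perspective mapping (bird's-eye view), the building block that a multi-camera representation of this kind generalises, can be obtained from a ground-plane homography. The sketch below is a generic OpenCV illustration with made-up point correspondences and a synthetic stand-in image; it is not the thesis's multi-camera, multi-modal implementation.

```python
import cv2
import numpy as np

# Generic single-camera inverse perspective mapping (IPM): estimate the
# homography that maps four image points on the road plane to a top-down
# grid, then warp the image. Correspondences and sizes are placeholders.

img = np.zeros((720, 1280, 3), dtype=np.uint8)      # stand-in for a camera frame

# Four image-plane points lying on the road (e.g. lane-marking corners)...
src = np.float32([[420, 560], [860, 560], [1180, 720], [100, 720]])
# ...and where they should land in the bird's-eye view (pixels on a flat grid).
dst = np.float32([[300, 0], [500, 0], [500, 600], [300, 600]])

H = cv2.getPerspectiveTransform(src, dst)            # 3x3 plane homography
birds_eye = cv2.warpPerspective(img, H, (800, 600))  # resample into the top view

cv2.imwrite("birds_eye.png", birds_eye)
```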
Abstract:
In the field of control systems it is common to use techniques based on model adaptation to carry out control of plants whose mathematical analysis may be intricate. Interest in biologically inspired learning algorithms for control techniques such as Artificial Neural Networks and Fuzzy Systems is increasing. Along these lines, this paper gives a perspective on the quality of the results produced by two such learning algorithms for the design of B-spline neural networks (BNN) and fuzzy systems (FS): Genetic Programming (GP) for BNN design, and the Bacterial Evolutionary Algorithm (BEA) for fuzzy rule extraction. The facility to incorporate a multi-objective approach into the GP algorithm is also outlined, enabling the designer to obtain models better suited to their intended use.
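To make the "B-spline neural network" concrete: its hidden units are B-spline basis functions, which can be evaluated with the Cox-de Boor recursion. The sketch below is a generic, single-input Python illustration with made-up knots and weights; it is not one of the GP- or BEA-designed models from the paper.

```python
import numpy as np

# Cox-de Boor recursion for B-spline basis functions, the kind of basis
# a B-spline neural network (BNN) uses as hidden units. Knot vector,
# order, and weights below are illustrative assumptions.

def bspline_basis(i, k, t, x):
    """Value of the i-th B-spline basis of order k (degree k-1) on knot
    vector t, evaluated at x."""
    if k == 1:
        return 1.0 if t[i] <= x < t[i + 1] else 0.0
    left = 0.0
    if t[i + k - 1] != t[i]:
        left = (x - t[i]) / (t[i + k - 1] - t[i]) * bspline_basis(i, k - 1, t, x)
    right = 0.0
    if t[i + k] != t[i + 1]:
        right = (t[i + k] - x) / (t[i + k] - t[i + 1]) * bspline_basis(i + 1, k - 1, t, x)
    return left + right

# A BNN output is a weighted sum of such basis functions:
knots = np.linspace(0.0, 1.0, 8)      # uniform knot vector
weights = np.random.rand(4)           # weights a GP/BEA run might tune
x = 0.37
y = sum(w * bspline_basis(i, 3, knots, x) for i, w in enumerate(weights))
print(y)
```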
Abstract:
Doctoral thesis, Chemistry, Universidade do Algarve, 2005
Abstract:
Doctoral thesis, Marine, Earth and Environmental Sciences (Marine Sciences - Physical Oceanography), Faculdade de Ciências e Tecnologia, Univ. do Algarve, 2011
Abstract:
The activity of Control Center operators is important to guarantee the effective performance of Power Systems. Operators’ actions are crucial for dealing with incidents, especially severe faults such as blackouts. In this paper, we present an Intelligent Tutoring approach for training Portuguese Control Center operators in tasks such as incident analysis and diagnosis, and service restoration of Power Systems. An Intelligent Tutoring System (ITS) approach is used in the training of the operators, taking into account context awareness and unobtrusive integration into the working environment. Several Artificial Intelligence techniques were judiciously selected and combined to obtain an effective Intelligent Tutoring environment, namely Multiagent Systems, Neural Networks, Constraint-based Modeling, Intelligent Planning, Knowledge Representation, Expert Systems, User Modeling, and Intelligent User Interfaces.
Abstract:
It is generally challenging to determine end-to-end delays of applications for maximizing the aggregate system utility subject to timing constraints. Many practical approaches suggest the use of intermediate deadlines for tasks in order to control and upper-bound their end-to-end delays. This paper proposes a unified framework for different time-sensitive, global optimization problems, and solves them in a distributed manner using Lagrangian duality. The framework uses global viewpoints to assign intermediate deadlines, taking resource contention among tasks into consideration. For soft real-time tasks, the proposed framework effectively addresses the deadline assignment problem while maximizing the aggregate quality of service. For hard real-time tasks, we show that existing heuristic solutions to the deadline assignment problem can be incorporated into the proposed framework, enriching their mathematical interpretation.
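A generic formulation of the kind of problem described, with intermediate deadlines d_i, per-task utilities U_i, and an end-to-end bound D_p for each path p of tasks, might look as follows. This is an illustrative sketch of Lagrangian decomposition under those assumptions, not the paper's exact model or constraints.

```latex
% Illustrative deadline-assignment problem (not the paper's exact model):
\[
\max_{d \ge 0}\ \sum_{i} U_i(d_i)
\quad\text{subject to}\quad
\sum_{i \in p} d_i \;\le\; D_p \quad \forall p .
\]
% Relaxing the path constraints with prices \lambda_p \ge 0 gives the Lagrangian
\[
L(d,\lambda) \;=\; \sum_{i} \Big( U_i(d_i) - d_i \sum_{p \ni i} \lambda_p \Big)
\;+\; \sum_{p} \lambda_p D_p ,
\]
% which separates across tasks: each task chooses its own d_i given the
% prices, and the prices can be updated distributedly, e.g. by subgradient ascent.
```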