843 results for PACS: expert systems and other AI software and techniques


Relevance:

100.00%

Publisher:

Abstract:

In Sweden, there are about 0.5 million single-family houses that are heated by electricity alone, and rising electricity costs are forcing conversion to other heating sources such as heat pumps and wood pellet heating systems. Pellet heating systems for single-family houses are currently a strongly growing market. A future shortage of wood fuels is possible even in Sweden, and combining wood pellet heating with solar heating will help to conserve bio-fuel resources. The objectives of this thesis are to investigate how electrically heated single-family houses can be converted to pellet and solar heating systems, and how the annual efficiency and solar gains can be increased in such systems. The possible reduction of CO emissions by combining pellet heating with solar heating has also been investigated.

Systems with pellet stoves (both with and without a water jacket), pellet boilers and solar heating have been simulated. Different system concepts have been compared in order to identify the most promising solutions. Modifications in system design and control strategies have been made in order to increase the system efficiency and the solar gains. The investigation of possibilities for increasing the solar gains has been limited to DHW units for hot water production and to the use of hot water, instead of electricity, for heating dishwashers and washing machines via a heat exchanger (heat-fed appliances). Computer models of pellet stoves, boilers, DHW units and heat-fed appliances have been developed, and the model parameters have been identified from measurements on real components. The agreement between the models and the measurements has been verified. The systems with wood pellet stoves have been simulated in three different multi-zone buildings, modelled in detail with heat distribution through door openings between the zones. For the other simulations, either a single-zone house model or a load file has been used. Simulations were carried out for Stockholm, Sweden; the simulations with heat-fed appliances were also carried out for Miami, USA.

The foremost result of this thesis is an increased understanding of the dynamic operation of combined pellet and solar heating systems for single-family houses. The results show that electricity savings and annual system efficiency are strongly affected by the system design and the control strategy. Large reductions in pellet consumption are possible by combining pellet boilers with solar heating (a reduction larger than the solar gains if the system is properly designed). In addition, large reductions in carbon monoxide emissions are possible. To achieve these reductions, the hot water production and the connection of the radiator circuit must be moved to a well-insulated, solar-heated buffer store, so that the boiler can be turned off during periods when the solar collectors cover the heating demand. The amount of electricity replaced by systems with pellet stoves is highly dependent on the house plan, the system design, whether internal doors are open or closed, and the comfort requirements. Proper system design and control strategies are crucial to obtaining high electricity savings and high comfort with pellet stove systems. The investigated technologies for increasing the solar gains (DHW units and heat-fed appliances) significantly increase the solar gains, but for the heat-fed appliances market introduction is difficult due to the limited financial savings and the need for a new heat distribution system. The applications closest to market introduction could be communal laundries and use in sunny climates, where the dominant part of the heat can be covered by solar heating. The DHW unit is economical but competes with the internal finned-tube heat exchanger, which is by far the dominant technology for hot water preparation in solar combisystems for single-family houses.
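As a minimal illustration of the bookkeeping behind the quantities discussed above, the sketch below computes annual system efficiency and electricity savings; the function names and energy figures are illustrative assumptions, not data from the thesis.

```python
# A minimal sketch, assuming illustrative energy figures for one year.
def annual_system_efficiency(heat_delivered_kwh, pellet_input_kwh,
                             electricity_input_kwh):
    """Useful heat (space heating + DHW) over total purchased energy."""
    return heat_delivered_kwh / (pellet_input_kwh + electricity_input_kwh)

def electricity_savings(baseline_kwh, electricity_input_kwh):
    """Electricity replaced relative to an all-electric baseline."""
    return baseline_kwh - electricity_input_kwh

# hypothetical numbers for a single-family house
print(annual_system_efficiency(15000, 16000, 2000))   # ~0.83
print(electricity_savings(20000, 2000))               # 18000 kWh
```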

Relevance:

100.00%

Publisher:

Abstract:

In the past, the focus of drainage design was on sizing pipes and storage in order to provide sufficient network capacity. This traditional approach, together with computer software and technical guidance, was successful for many years. However, due to rapid population growth and urbanisation, the requirements of a "good" drainage design have changed significantly. In addition to water management, other aspects such as environmental impacts, amenity value and carbon footprint have to be considered during the design process. Going forward, we need to address the key sustainability issues carefully and practically. The key challenge in moving from simple objectives (e.g. capacity and cost) to complex objectives (e.g. capacity, flood risk, environment, amenity, etc.) is the difficulty of striking a balance between the various objectives and justifying the potential benefits and compromises. In order to assist decision makers, we developed a new decision support system for drainage design. The system consists of two main components: a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. The evaluation framework is used to quantify the performance, life-cycle costs and benefits of different drainage systems. The optimisation tool can search for feasible combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper, we discuss real-world applications of the decision support system. A number of case studies have been developed based on recent drainage projects in China. We use the case studies to illustrate how the evaluation framework highlights and compares the pros and cons of various design options. We also discuss how the design parameters can be optimised based on the preferences of decision makers. The work described here is the output of an EngD project funded by EPSRC and XP Solutions.
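As a rough sketch of the multi-criteria evaluation idea, the following fragment scores design options against weighted criteria; the criteria, options and weights are hypothetical, and the paper's actual framework is considerably more detailed.

```python
# A minimal sketch, assuming normalised criterion scores in [0, 1]
# (higher is better) and decision-maker preference weights.
from typing import Dict

CRITERIA = ("capacity", "flood_risk", "environment", "amenity", "cost")

def weighted_score(option: Dict[str, float], weights: Dict[str, float]) -> float:
    """Aggregate criterion scores with preference weights."""
    total = sum(weights.values())
    return sum(option[c] * weights[c] for c in CRITERIA) / total

designs = {
    "pipes_only":      {"capacity": 0.9, "flood_risk": 0.6, "environment": 0.3,
                        "amenity": 0.2, "cost": 0.7},
    "pipes_plus_SuDS": {"capacity": 0.8, "flood_risk": 0.8, "environment": 0.8,
                        "amenity": 0.7, "cost": 0.5},
}
weights = {"capacity": 3, "flood_risk": 3, "environment": 2, "amenity": 1, "cost": 2}

scores = {d: round(weighted_score(v, weights), 3) for d, v in designs.items()}
print(scores, "best:", max(scores, key=scores.get))
```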

Relevance:

100.00%

Publisher:

Abstract:

As enterprises constantly grow and the need to share information across departments and business areas becomes more critical, companies are turning to integration to provide a method for interconnecting heterogeneous, distributed and autonomous systems. Whether the sales application needs to interface with the inventory application or the procurement application needs to connect to an auction site, it seems that any application can be made better by integrating it with other applications. Integration between applications can face several difficulties because applications may not have been designed and implemented with integration in mind. Regarding integration issues, two-tier software systems, composed of a database tier and a "front-end" (interface) tier, have shown some limitations. As a solution to overcome the two-tier limitations, three-tier systems were proposed in the literature. By adding a middle tier (referred to as middleware) between the database tier and the "front-end" tier (or simply the application), three main benefits emerge. First, the division of software systems into three tiers increases integration capabilities with other systems. Second, modifications to the individual tiers may be carried out without necessarily affecting the other tiers and integrated systems. Third, as a consequence of the other two, fewer maintenance tasks are required in the software system and in all integrated systems. Concerning software development in three tiers, this dissertation focuses on two emerging technologies, the Semantic Web and Service-Oriented Architecture, combined with middleware. Blending these two technologies with middleware resulted in the development of the Swoat framework (Service and Semantic Web Oriented ArchiTecture), which leads to four synergistic advantages: (1) it allows the creation of loosely coupled systems, decoupling the database from the "front-end" tiers and thereby reducing maintenance; (2) the database schema is transparent to the "front-end" tiers, which are aware only of the information model (or domain model) that describes what data is accessible; (3) integration with other heterogeneous systems is enabled through services provided by the middleware; (4) service requests by the "front-end" tier focus on 'what' data is needed rather than on 'where' and 'how' to obtain it, thereby reducing application development time.
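A minimal sketch of the 'what, not where/how' principle behind the middleware tier is given below; the class, method and entity names are hypothetical and are not the actual Swoat API.

```python
# A minimal sketch, assuming a dict-like database tier and a hypothetical
# mapping from domain-model entities to physical tables.
class Middleware:
    """Middle tier: front ends ask for domain-model entities; the mapping
    to database tables stays hidden behind this service."""
    def __init__(self, db, schema_map):
        self.db = db                    # database tier
        self.schema_map = schema_map    # domain entity -> physical table

    def get(self, entity, **filters):
        table = self.schema_map[entity]     # 'where' is resolved here
        rows = self.db[table]               # 'how' is resolved here
        return [r for r in rows
                if all(r.get(k) == v for k, v in filters.items())]

# the front-end tier only states *what* it wants:
db = {"tbl_cust_v2": [{"name": "Ada", "country": "PT"}]}
mw = Middleware(db, {"Customer": "tbl_cust_v2"})
print(mw.get("Customer", country="PT"))
```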

Relevance:

100.00%

Publisher:

Abstract:

The effects of agricultural-pastoral and tillage practices on soil microbial populations and activities have not been systematically investigated. The effects of no-tillage (NT), no-tillage integrated agricultural-pastoral systems (NT-I) and conventional tillage (CT), at soil depths of 0-10, 10-20 and 20-30 cm, on microbial populations (bacteria and fungi), biomass-C, potential nitrification, urease and protease activities, total organic matter and total N contents were investigated. The crops used were soybean (in the NT, NT-I and CT systems), corn (in the NT and NT-I systems) and Tanner grass (Brachiaria sp.) (in the NT-I system); a forest system was used as a control. Urease and protease activities, biomass-C and the contents of organic matter and total N were higher (p < 0.05) in the forest soil than in the other soils. Potential nitrification was significantly higher in the NT-I system than in the other systems. Bacterial counts were similar in all systems. Fungal counts were similar in the CT and forest systems, but both were higher than in NT. All of these variables depended on the organic matter content and decreased (p < 0.05) from the upper soil layer to the deeper layers. These results indicate that no-tillage integrated agricultural-pastoral systems may be useful for soil conservation.

Relevance:

100.00%

Publisher:

Abstract:

Malware has become a major threat in recent years due to the ease with which it spreads through the Internet. Malware detection has become difficult with the use of compression, polymorphic methods, and techniques to detect and disable security software. These and other obfuscation techniques pose a problem for detection and classification schemes that analyze malware behavior. In this paper we propose a distributed architecture to improve malware collection, using different honeypot technologies to increase the variety of malware collected. We also present a daemon tool developed to collect malware distributed through spam, and a pre-classification technique that uses antivirus technology to separate malware into generic classes. © 2009 SPIE.
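The following is a minimal sketch of how antivirus labels might drive the kind of pre-classification into generic classes described above; the label format and class keywords are illustrative assumptions, not the paper's exact scheme.

```python
# A minimal sketch, assuming AV detection labels embed a family keyword.
GENERIC_CLASSES = ("worm", "trojan", "backdoor", "virus", "bot")

def pre_classify(av_label: str) -> str:
    """Map a raw AV label (e.g. 'Worm.Win32.AutoRun.xyz') to a generic class."""
    label = av_label.lower()
    for cls in GENERIC_CLASSES:
        if cls in label:
            return cls
    return "unclassified"

for sample in ("Worm.Win32.AutoRun.xyz", "Trojan-Downloader.Agent", "EICAR-Test"):
    print(sample, "->", pre_classify(sample))
```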

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a control method for a class of continuous-time switched systems, using state-feedback variable structure controllers. The method is applied to the control of a non-trivial dc-dc power converter, and a simple and inexpensive control circuit design, simulated using PSpice, is proposed. The design is based on Lyapunov-Metzler-SPR systems, and the performance of the resulting control system is superior to that afforded by a recently proposed alternative sliding-mode control technique. © 2011 IFAC.
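For reference, a sketch of the Lyapunov-Metzler stability conditions from the switched-systems literature that underlie such designs: one seeks positive definite matrices and a Metzler matrix satisfying the coupled inequalities below, with the switching rule selecting the minimizing quadratic form.

```latex
% Switched linear system \dot{x}(t) = A_{\sigma(t)} x(t),
% \sigma(t) \in \{1,\dots,N\}. Find P_i \succ 0 and a Metzler matrix
% \Pi = [\pi_{ji}] (\pi_{ji} \ge 0 for j \ne i, \sum_{j=1}^{N} \pi_{ji} = 0)
% such that:
\begin{align}
  A_i^{\top} P_i + P_i A_i + \sum_{j=1}^{N} \pi_{ji} P_j &\prec 0,
    \qquad i = 1,\dots,N, \\
  \sigma(x(t)) &= \arg\min_{i}\; x(t)^{\top} P_i\, x(t)
\end{align}
```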

Relevance:

100.00%

Publisher:

Abstract:

Secondary phases such as Laves and carbides are formed during the final solidification stages of nickel-based superalloy coatings deposited by the gas tungsten arc welding cold wire process. However, when aged at high temperatures, other phases, such as the γ″ and δ phases, can precipitate in the microstructure. This work presents a new application and evaluation of artificial intelligence techniques to classify ultrasound signals (background echo and backscattered) in order to characterize the microstructure of a Ni-based alloy thermally aged at 650 and 950 °C for 10, 100 and 200 h. The background echo and backscattered ultrasound signals were acquired using transducers with frequencies of 4 and 5 MHz. Using feature extraction techniques, namely detrended fluctuation analysis (DFA) and the Hurst method, the accuracy and speed of classifying the secondary phases from ultrasound signals could be studied. The classifiers under study were the recent optimum-path forest (OPF) and the more traditional support vector machine and Bayesian classifiers. The experimental results revealed that the OPF classifier was the fastest and most reliable. In addition, the OPF classifier proved to be a valid and adequate tool for microstructure characterization through ultrasound signal classification due to its speed, sensitivity, accuracy and reliability. © 2013 Elsevier B.V. All rights reserved.
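A minimal sketch of the detrended fluctuation analysis feature named above is given below, assuming a 1-D ultrasound signal; the window sizes and implementation details are illustrative, not the paper's exact settings.

```python
import numpy as np

def dfa_exponent(signal, window_sizes=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis: slope of log F(n) vs log n,
    yielding the scaling exponent used as a classifier feature."""
    profile = np.cumsum(signal - np.mean(signal))   # integrated profile
    fluctuations = []
    for n in window_sizes:
        n_windows = len(profile) // n
        f2 = []
        for k in range(n_windows):
            seg = profile[k * n:(k + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)          # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return slope
```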

Relevance:

100.00%

Publisher:

Abstract:

Breast cancer is the most common cancer among women. In CAD systems, several studies have investigated the use of the wavelet transform as a multiresolution analysis tool for texture analysis, whose outputs can serve as inputs to a classifier. For classification, the polynomial classifier has been used because it provides a single model for the optimal separation of classes. In this paper, a system is proposed for texture analysis and classification of lesions in mammographic images. Multiresolution analysis features were extracted from the region of interest of a given image. These features were computed based on three different wavelet functions: Daubechies 8, Symlet 8 and biorthogonal 3.7. For classification, we used the polynomial classification algorithm to label the mammogram images as normal or abnormal. We also made a comparison with other artificial intelligence algorithms (decision tree, SVM, K-NN). A receiver operating characteristic (ROC) curve is used to evaluate the performance of the proposed system. The system is evaluated using 360 digitized mammograms from the DDSM database, and the results show that the algorithm achieves an area under the ROC curve Az of 0.98 ± 0.03. The performance of the polynomial classifier proved better than that of the other classification algorithms. © 2013 Elsevier Ltd. All rights reserved.
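A minimal sketch of multiresolution feature extraction with the three named mother wavelets, using the PyWavelets library; the sub-band energy features are an assumption for illustration, as the abstract does not specify the exact feature set.

```python
import numpy as np
import pywt

def wavelet_energy_features(roi, wavelet="db8", levels=3):
    """Sub-band energy features from a 2-D region of interest."""
    coeffs = pywt.wavedec2(roi, wavelet=wavelet, level=levels)
    features = [np.mean(coeffs[0] ** 2)]            # approximation energy
    for (cH, cV, cD) in coeffs[1:]:                 # detail sub-bands per level
        features.extend([np.mean(cH ** 2), np.mean(cV ** 2), np.mean(cD ** 2)])
    return np.array(features)

# e.g. concatenate features for the three mother wavelets compared in the paper
# roi = ...  # 2-D numpy array extracted from the mammogram
# feats = np.concatenate([wavelet_energy_features(roi, w)
#                         for w in ("db8", "sym8", "bior3.7")])
```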

Relevance:

100.00%

Publisher:

Abstract:

Caribbean policymakers are faced with special challenges from climate change, related to the uncertainties inherent in future climate projections and the complex linkages among climate change, physical and biological systems, and socioeconomic sectors. The impacts of climate change threaten development in the Caribbean and may well erode previous development gains, as evidenced by the increased incidence of climate migrants internationally. This brief, based on a recent study conducted by the Economic Commission for Latin America and the Caribbean (LC/CAR/L.395), provides a synthesis of the assessment of the economic and social impacts of climate change on the coastal and marine sector in the Caribbean. It provides Caribbean policymakers with cutting-edge information on the region's vulnerability and encourages the development of adaptation strategies informed by both local experience and expert knowledge. It proceeds from an acknowledgement that the unique combination of natural resources, ecosystems, economic activities and human settlements of the Caribbean will not be immune to the impacts of climate change, and that local communities, countries and the subregion as a whole need to plan for, and adapt to, these effects. Climate and extreme weather hazards related to the coastal and marine sector encompass the distinct but related factors of sea level rise, increasing coastal water temperatures, and tropical storms and hurricanes. Potential vulnerabilities for coastal zones include increased shoreline erosion leading to alteration of the coastline, loss of coastal wetlands, and changes in the abundance and diversity of fish and other marine populations. The study examines four key themes: climate, vulnerability, the economic and social costs associated with climate change impacts, and adaptive measures.

Relevance:

100.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is twofold: to analyze the computational complexity of the cogeneration design problem, and to present an expert system to solve the proposed problem, comparing such an approach with the traditional search methods available.

Design/methodology/approach - The complexity of the cogeneration problem is analyzed through a transformation of the well-known knapsack problem. Both problems are formulated as decision problems, and it is proven that the cogeneration problem is NP-complete. Thus, several search approaches, such as population heuristics and dynamic programming, could be used to solve the problem. Alternatively, a knowledge-based approach is proposed by presenting an expert system and its knowledge representation scheme.

Findings - The expert system is executed considering two case studies. In the first, a cogeneration plant should meet power, steam, chilled water and hot water demands. The expert system presented two different solutions based on high-complexity thermodynamic cycles. In the second case study, the plant should meet only power and steam demands. The system presents three different solutions, one of which had never been considered before by our expert consultant.

Originality/value - The expert system approach is not a "blind" method, i.e. it generates solutions based on actual engineering knowledge instead of the search strategies of traditional methods. This means that the system is able to explain its choices, making the design rationale available for each solution. This is the main advantage of the expert system approach over traditional search methods. On the other hand, the expert system quite likely does not provide an actual optimal solution. All it can provide is one or more acceptable solutions.
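For reference, the decision version of the knapsack problem used in such NP-completeness reductions asks the following:

```latex
% Knapsack, decision version: given item values v_i, weights w_i,
% a capacity W and a target K, is there a subset S \subseteq \{1,\dots,n\} with
\begin{equation}
  \sum_{i \in S} w_i \le W
  \qquad \text{and} \qquad
  \sum_{i \in S} v_i \ge K \, ?
\end{equation}
```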

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the design of an expert system that captures a waveform using an accelerometer, processes the signal, and converts it to the frequency domain using a Fast Fourier Transform. Then, using artificial intelligence techniques, specifically fuzzy reasoning, it determines whether any failure mode is present in the equipment, such as imbalance, misalignment or bearing defects.
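A minimal sketch of the FFT front end and a fuzzy membership function is given below; the diagnostic heuristics (high amplitude at 1x shaft frequency suggesting imbalance, at 2x suggesting misalignment) are common rules of thumb, and the thresholds are hypothetical rather than taken from the paper.

```python
import numpy as np

def spectral_peaks(signal, fs, shaft_hz):
    """FFT-based amplitudes at 1x and 2x the shaft rotation frequency."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    amp_at = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
    return amp_at(shaft_hz), amp_at(2 * shaft_hz)

def fuzzy_membership(amplitude, low, high):
    """Piecewise-linear membership in the fuzzy set 'high vibration'."""
    return float(np.clip((amplitude - low) / (high - low), 0.0, 1.0))

# hypothetical usage: 25 Hz shaft, 1 kHz sampling, thresholds 0.1 / 0.5 g
fs, shaft_hz = 1000, 25
t = np.arange(0, 1, 1 / fs)
signal = 0.6 * np.sin(2 * np.pi * shaft_hz * t)      # imbalance-like signature
a1, a2 = spectral_peaks(signal, fs, shaft_hz)
print("imbalance degree:", fuzzy_membership(a1, 0.1, 0.5))
print("misalignment degree:", fuzzy_membership(a2, 0.1, 0.5))
```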

Relevance:

100.00%

Publisher:

Abstract:

The immobilization of metal nanoparticles in magnetically responsive solids allows the easy, fast, and clean separation of catalysts; however, the efficiency of this separation process depends on a strong metal-support interaction. This interaction can be enhanced by functionalizing the support surface with amino groups. Our catalyst support contains an inner core of magnetite, which enables magnetic separation from liquid systems, and an external surface of silica suitable for further modification with organosilanes. We report herein that a magnetically recoverable amino-functionalized support captured iridium species from liquid solutions and produced a highly active hydrogenation catalyst with negligible metal leaching. An analogous Ir⁰ catalyst prepared using a non-functionalized support shows a higher degree of metal leaching into the liquid products. The catalytic performance in the hydrogenation of alkenes is compared with that of Rh and Pt catalysts.

Relevance:

100.00%

Publisher:

Abstract:

Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only access to heterogeneous data sources but also the definition of transformation rules on exchanged data.

Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data.

Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
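A minimal sketch of an ontology-aware software connector in the spirit described above; the term mapping, ontology identifiers and record format are hypothetical illustrations, not the methodology's actual artifacts.

```python
# A minimal sketch, assuming a hypothetical mapping from a tool's local
# field names to reference-ontology terms plus simple transformation rules.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Connector:
    """Bridges a data source and an analysis tool, applying transformation
    rules and ontology term mappings to each exchanged record."""
    term_map: Dict[str, str]              # local field name -> ontology term
    rules: List[Callable[[dict], dict]]   # transformation rules, applied in order

    def translate(self, record: dict) -> dict:
        for rule in self.rules:
            record = rule(record)
        return {self.term_map.get(k, k): v for k, v in record.items()}

# usage: map a (hypothetical) microarray record to reference-ontology terms
connector = Connector(
    term_map={"probe": "GEXP:probe_id", "expr": "GEXP:expression_level"},
    rules=[lambda r: {**r, "expr": float(r["expr"])}],   # normalise the value
)
print(connector.translate({"probe": "AFFX-001", "expr": "7.25"}))
```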