905 results for pacs: expert systems and other ai software and techniques
Abstract:
Over the last decade the problem of surface inspection has received great attention from the scientific community, since quality control and product maintenance are key points in several industrial applications. Railway associations spend considerable money checking railway infrastructure, a field in which periodic surface inspection helps the operator prevent critical situations, so the maintenance and monitoring of this infrastructure is an important concern for railway associations. Surface inspection therefore also matters to railroad authorities, who must investigate track components, identify problems and work out how to solve them. In the railway industry, problems are usually found in sleepers, overhead lines, fasteners, rail heads, switches and crossings, and in the ballast section.
In this thesis work I have reviewed research papers on AI techniques combined with non-destructive testing (NDT) techniques, which can collect data from the test object without damaging it. The reviewed works demonstrate that, by adopting AI-based systems, almost all of these problems can be addressed, and that such systems are reliable and efficient for diagnosing faults in this transportation domain. I have also reviewed solutions offered by different companies based on AI techniques, their products, and white papers provided by some of those companies. Machine vision, stereo vision, laser-based techniques and neural networks are the AI approaches most often used to automate tasks otherwise performed by railway engineers. These AI-based techniques operate within the NDT approach, a broad, interdisciplinary field that plays a critical role in assuring that structural components and systems perform their function in a reliable and cost-effective fashion. NDT verifies the uniformity, quality and serviceability of materials without damaging the material being tested, using methods such as visual and optical testing, radiography, magnetic particle testing, ultrasonic testing, penetrant testing, electromagnetic testing and acoustic emission testing. Inspection is carried out periodically for better maintenance, by railway engineers aided by AI-based techniques.
The main idea of this thesis is to demonstrate, based on the work of different researchers and companies, how the problems of this transportation area can be reduced; I also provide comments on those works and propose better inspection methods where they are needed. The scope of the thesis is the automatic interpretation of data from NDT, with the goal of detecting flaws accurately and efficiently; AI techniques such as neural networks, machine vision, knowledge-based systems and fuzzy logic have been applied to a wide spectrum of problems in this area. A further aim is to provide insight into possible research methods concerning railway sleeper, fastener, ballast and overhead inspection through automatic interpretation of data. I have therefore discussed the problems that arise in railway sleepers, fasteners, overhead equipment and ballasted track, reviewed research papers related to these areas, demonstrated how the proposed systems work and what results they achieve, and highlighted the advantages of AI techniques over the manual systems that existed previously. Overall, this work summarizes the findings of a large number of research papers deploying artificial intelligence (AI) techniques for the automatic interpretation of data from non-destructive testing (NDT), focusing mainly on problems in the rail transport domain: the inspection of railway sleepers, fasteners, ballast and overhead equipment.
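To make the flaw-detection idea concrete, here is a minimal sketch, not taken from the thesis, of how a neural network might classify rail-fastener inspection samples as intact or defective from NDT-derived feature vectors; the feature names and data are hypothetical placeholders.

```python
# Illustrative sketch (not from the thesis): a small neural-network
# classifier labelling rail-fastener inspection samples as intact or
# defective from NDT-derived feature vectors. Features and data are
# hypothetical placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical features per sample: ultrasonic echo amplitude,
# edge density from a camera image, laser profile deviation.
n = 500
X_intact = rng.normal([0.8, 0.3, 0.1], 0.1, size=(n, 3))
X_defect = rng.normal([0.5, 0.6, 0.4], 0.1, size=(n, 3))
X = np.vstack([X_intact, X_defect])
y = np.array([0] * n + [1] * n)  # 0 = intact, 1 = defective

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

In a real system the feature vectors would come from machine vision or ultrasonic measurements rather than synthetic distributions.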
Abstract:
In a northern European climate a typical solar combisystem for a single-family house normally saves between 10 and 30% of the auxiliary energy needed for space heating and domestic water heating. It is considered uneconomical to dimension systems for higher energy savings, and overheating problems may also occur. One way of avoiding these problems is to use a collector designed to have a low optical efficiency in summer, when the solar elevation is high and the load is small, and a high optical efficiency in early spring and late fall, when the solar elevation is low and the load is large. The study investigates the possibilities of designing the system, and in particular the collector optics, to match the system performance with the yearly variations of the heating load and the solar irradiation. It seems possible to design practically viable load-adapted collectors, and to use them for whole roofs (40 m2) without causing more overheating stress on the system than with a standard 10 m2 system. The load-adapted collectors collect roughly as much energy per unit area as flat-plate collectors, but they may be produced at a lower cost due to lower material costs. There is additional potential for cost reduction, since the load-adapted collector can be designed for low stagnation temperatures, making it possible to use less expensive materials. One and the same collector design is suitable for a wide range of system sizes and roof inclinations. The report contains descriptions of optimized collector designs, properties of realistic collectors, and results of calculations of system output, stagnation performance and cost performance. Appropriate computer tools for optical analysis, optimization of collectors in systems and a very fast simulation model have been developed.
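As an illustration of the load-adaptation idea, the following sketch models a collector whose optical efficiency falls off above a design cutoff elevation; the functional form and all parameter values are assumptions, not the report's actual optics.

```python
# Illustrative sketch (assumed functional form and parameters, not the
# report's optics model): a collector whose optical efficiency drops at
# high solar elevations, curbing summer gains while keeping spring and
# autumn gains high.
import math

def optical_efficiency(elevation_deg, eta0=0.75, cutoff_deg=45.0, width_deg=8.0):
    """Smooth step: near eta0 below the cutoff elevation, near zero above.
    cutoff_deg and width_deg are hypothetical design parameters."""
    x = (elevation_deg - cutoff_deg) / width_deg
    return eta0 / (1.0 + math.exp(x))

def collected_power(irradiance_w_m2, area_m2, elevation_deg):
    return irradiance_w_m2 * area_m2 * optical_efficiency(elevation_deg)

# Compare a 40 m2 load-adapted roof at a low spring sun and a high summer sun.
for label, elev in [("spring noon, 25 deg", 25), ("summer noon, 55 deg", 55)]:
    print(label, f"{collected_power(800, 40, elev):,.0f} W")
```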
Abstract:
Emissions are an important aspect of a pellet heating system. High carbon monoxide emissions are often caused by unnecessary cycling of the burner when it is operated below its lowest combustion power. Combining a pellet heating system with a solar heating system can significantly reduce cycling of the pellet heater and avoid its inefficient summer operation. The aim of this paper was to study the CO emissions of the different types of systems and to compare the yearly CO emissions obtained from simulations with the yearly CO emissions calculated from the values obtained by the standard test methods. The results showed that the yearly CO emissions obtained from the simulations are significantly higher than those calculated from the standard test methods. It is also shown that, for the studied systems, the average emissions under these realistic annual conditions were greater than the limit values of two eco-labels. Furthermore, the CO emissions can be almost halved if the pellet heater is combined with a solar heating system.
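A back-of-the-envelope sketch of why cycling matters, with entirely hypothetical numbers rather than the paper's measured data: yearly CO can be modeled as a steady-combustion term plus a per-start penalty, so reducing burner starts, as the solar combination does, cuts emissions sharply.

```python
# Back-of-the-envelope sketch (hypothetical numbers, not the paper's
# data): yearly CO as a steady-combustion term plus a per-start penalty,
# showing why frequent cycling inflates emissions relative to the
# steady-state values used by standard test methods.
def yearly_co_g(heat_demand_kwh, co_steady_g_per_kwh, starts_per_year, co_per_start_g):
    return heat_demand_kwh * co_steady_g_per_kwh + starts_per_year * co_per_start_g

demand = 15_000          # kWh/a, hypothetical single-family heat demand
steady = 0.05            # g CO per kWh at rated power (test-bench value)
pellet_only = yearly_co_g(demand, steady, starts_per_year=3000, co_per_start_g=2.0)
with_solar = yearly_co_g(demand * 0.8, steady, starts_per_year=1200, co_per_start_g=2.0)

print(f"pellet only: {pellet_only/1000:.1f} kg CO/a")
print(f"with solar:  {with_solar/1000:.1f} kg CO/a")
```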
Abstract:
With the building sector accounting for around 40% of the total energy consumption in the EU, energy efficiency in buildings is and continues to be an important issue. Great progress has been made in reducing the energy consumption of new buildings, but the large stock of existing buildings with poor energy performance is probably an even more crucial area of focus. This thesis deals with energy efficiency measures suitable for the renovation of existing houses, particularly low-temperature heating systems and ventilation systems with heat recovery. The energy performance, environmental impact and costs are evaluated for a range of system combinations, for small and large houses with various heating demands and for different climates in Europe. The results were derived through simulation with energy calculation tools. Low-temperature heating and air heat recovery were both found to be promising with regard to increasing energy efficiency in European houses. These solutions proved particularly effective in Northern Europe, as low-temperature heating and air heat recovery have a greater impact in cold climates and on houses with high heating demands. The performance of heat pumps, both with outdoor air and exhaust air, was seen to improve with low-temperature heating. The choice between an exhaust air heat pump and a ventilation system with heat recovery is likely to depend on case-specific conditions, but both choices are more cost-effective and have a lower environmental impact than systems without heat recovery. The advantage of the heat pump is that it can be used all year round, since it also produces domestic hot water (DHW). Economic and environmental aspects of energy efficiency measures do not always harmonize: on the one hand, lower costs can sometimes mean larger environmental impact; on the other hand, there can be divergence between different environmental aspects. This makes it difficult to define financial subsidies to promote energy efficiency measures.
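A minimal sketch of why low-temperature heating helps heat pumps, using the textbook Carnot relation scaled by an assumed efficiency fraction rather than the thesis's simulation tools:

```python
# Minimal sketch (textbook Carnot relation with an assumed real-world
# efficiency fraction, not the thesis's energy calculation tools): a
# lower supply temperature narrows the temperature lift and raises COP.
def cop_estimate(t_supply_c, t_source_c, carnot_fraction=0.45):
    """Carnot COP scaled by an assumed fraction of ideal performance."""
    t_hot = t_supply_c + 273.15
    t_cold = t_source_c + 273.15
    return carnot_fraction * t_hot / (t_hot - t_cold)

for supply in (55, 45, 35):  # radiator vs low-temperature vs floor heating
    print(f"supply {supply} C, outdoor 0 C: COP ~ {cop_estimate(supply, 0):.1f}")
```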
Abstract:
The Grid is a large-scale computer system that is capable of coordinating resources that are not subject to centralised control, whilst using standard, open, general-purpose protocols and interfaces, and delivering non-trivial qualities of service. In this chapter, we argue that Grid applications very strongly suggest the use of agent-based computing, and we review key uses of agent technologies in Grids: user agents, able to customize and personalise data; agent communication languages offering a generic and portable communication medium; and negotiation allowing multiple distributed entities to reach service level agreements. In the second part of the chapter, we focus on Grid service discovery, which we have identified as a prime candidate for use of agent technologies: we show that Grid-services need to be located via personalised, semantic-rich discovery processes, which must rely on the storage of arbitrary metadata about services that originates from both service providers and service users. We present UDDI-MT, an extension to the standard UDDI service directory approach that supports the storage of such metadata via a tunnelling technique that ties the metadata store to the original UDDI directory. The outcome is a flexible service registry which is compatible with existing standards and also provides metadata-enhanced service discovery.
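The following toy sketch illustrates the general idea behind metadata-enhanced discovery, with an in-memory registry standing in for UDDI-MT itself; all service names, URLs and metadata values are hypothetical.

```python
# Toy sketch of metadata-enhanced discovery (an in-memory stand-in, not
# the UDDI-MT implementation): service entries carry arbitrary metadata
# originating from both providers and users, and queries filter on it.
class MetadataRegistry:
    def __init__(self):
        self._services = {}   # service key -> {"url": ..., "metadata": {...}}

    def publish(self, key, url):
        self._services[key] = {"url": url, "metadata": {}}

    def annotate(self, key, source, **metadata):
        """Attach metadata from a provider or a user to an existing entry."""
        self._services[key]["metadata"].setdefault(source, {}).update(metadata)

    def discover(self, **required):
        """Return services whose combined metadata matches every criterion."""
        hits = []
        for key, entry in self._services.items():
            merged = {}
            for annotations in entry["metadata"].values():
                merged.update(annotations)
            if all(merged.get(k) == v for k, v in required.items()):
                hits.append((key, entry["url"]))
        return hits

registry = MetadataRegistry()
registry.publish("render-svc", "http://grid.example/render")  # hypothetical
registry.annotate("render-svc", source="provider", gpu=True)
registry.annotate("render-svc", source="user:alice", rating="good")
print(registry.discover(gpu=True, rating="good"))
```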
Abstract:
This study looks at the historical context in which PACs developed, as well as the current legal environment in which they operate. It will also briefly discuss the legal and procedural challenges that candidates face and the ways in which PACs alleviate some of these pressures in ways that presidential committees cannot. An understanding of the strategic dilemmas which cause candidates to seek extraneous structures through which to establish campaign networks is essential to extrapolating the potential future of campaign finance strategy. Furthermore, this study provides an in-depth analysis of the state Commonwealth PACs both in terms of fundraising and spending, and discusses the central issues this state PAC strategy raises with respect to campaign finance law. The study will conclude with a look into the future of campaign financing and the role these state-level PACs may play if current rules are not revised.
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprises a layered software environment that aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, follows the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements not available in previous approaches:
- object-oriented frameworks are extensible by design, so the same holds for the implemented sets of design data primitives and design tool building blocks. Both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation;
- the design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, giving collaborating parties the flexibility to choose individual visualization settings;
- the control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. The mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency;
- to optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows a late synchronization between view and semantics in case of unavailability of a network connection between them;
- the use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. Connecting to remote tools and services through a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime;
- the implemented CAD Framework is completely based on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and introduced a new paradigm for remote interaction between designers. The resulting CAD Framework supports fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experience among designers, improving the collaboration potential significantly when compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model, explored in the frame of an online educational and training platform.
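A schematic sketch of the described inversion of control, written in Python for brevity although the Cave2 framework itself is Java-based; class and method names are illustrative, not the framework's API.

```python
# Schematic sketch of the inversion of control described above: a view
# wraps each user interaction in an event object and forwards it to the
# semantic model, which decides whether the state change is legal and,
# if so, updates itself and refreshes every registered view.
class DesignEvent:
    def __init__(self, kind, payload):
        self.kind, self.payload = kind, payload

class SemanticModel:
    def __init__(self):
        self._views = []
        self._nets = set()           # hypothetical design data: net names

    def attach(self, view):
        self._views.append(view)

    def handle(self, event):
        if event.kind == "add_net" and event.payload not in self._nets:
            self._nets.add(event.payload)        # legal change: commit...
            for view in self._views:             # ...then refresh all views
                view.refresh(self._nets)
        # illegal or unknown events are simply rejected

class SchematicView:
    def __init__(self, name, model):
        self.name = name
        self._model = model
        model.attach(self)

    def on_user_input(self, net_name):
        # the view never mutates design data directly
        self._model.handle(DesignEvent("add_net", net_name))

    def refresh(self, nets):
        print(f"{self.name} now shows nets: {sorted(nets)}")

model = SemanticModel()
alice, bob = SchematicView("alice", model), SchematicView("bob", model)
alice.on_user_input("clk")   # both views refresh, staying consistent
```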
Abstract:
Generalized hypercompetitiveness in the world markets has created the need to offer better products to potential and actual clients in order to gain an advantage over competitors. To ensure the production of an adequate product, enterprises need to work on the efficiency and efficacy of their business processes (BPs) by means of the construction of Interactive Information Systems (IISs, including Interactive Multimedia Documents), so that these processes are executed more fluidly and correctly. The construction of the correct IIS is a major task that can only be successful if the needs of every intervenient are taken into account. Their requirements must be defined with precision and extensively analyzed, and the system must then be accurately designed in order to minimize implementation problems, so that the IIS is produced on schedule and with as few mistakes as possible. The main contribution of this thesis is the proposal of Goals, a software (engineering) construction process which aims at defining the tasks to be carried out in order to develop software. This process defines the stakeholders, the artifacts, and the techniques that should be applied to achieve correctness of the IIS. Complementarily, the process suggests two methodologies to be applied in the initial phases of the software engineering lifecycle: Process Use Cases for the requirements phase, and MultiGoals for the analysis and design phases. Process Use Cases is a UML-based (Unified Modeling Language), goal-driven and use-case-oriented methodology for the definition of functional requirements. It uses an information-oriented strategy to identify BPs while constructing the enterprise's information structure, and finalizes with the identification of use cases within the design of these BPs. This approach provides a useful tool for both Business Process Management and Software Engineering activities. MultiGoals is a UML-based, use-case-driven and architecture-centric methodology for the analysis and design of IISs with support for multimedia. It proposes the analysis of user tasks as the basis for the design of: (i) the user interface; (ii) the system behaviour, modeled by means of patterns which can combine multimedia and standard information; and (iii) the database and media contents. This thesis presents these approaches theoretically, accompanied by examples from a real project, which provide the necessary support for understanding the techniques used.
Abstract:
The objective of this study was to elucidate population fluctuations of spider and ant species in forest fragments and adjacent soybean and corn crops under no-tillage and conventional tillage systems, and their correlations with meteorological factors. From Nov 2004 to Apr 2007, these arthropods were sampled at Guaira, São Paulo state, biweekly during the cropping season and monthly during the periods between crops. To obtain samples at each experimental site, pitfall traps were distributed in 2 transects of 200 m, of which 100 m lay in the crop and 100 m in the forest fragment. Temperature and rainfall were found to have major impacts on fluctuations in the population densities of ants of the genus Pheidole in soybean and corn crops grown under both conventional tillage and no-tillage systems.
Abstract:
Service provisioning is a challenging research area for the design and implementation of autonomic service-oriented software systems. It includes automated QoS management for such systems and their applications. Monitoring, diagnosis and repair are three key features of QoS management. This work presents a self-healing Web service-based framework that manages QoS degradation at runtime. Our approach is based on proxies. Proxies act on meta-level communications and extend the HTTP envelope of the exchanged messages with QoS-related parameter values. QoS data are filtered over time and analysed using statistical functions and the Hidden Markov Model. Detected QoS degradations are handled with proxies. We evaluated our framework on an orchestrated electronic shop application (FoodShop).
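The sketch below illustrates degradation detection on proxy-collected response times. Note the substitution: the paper filters QoS data with statistical functions and a Hidden Markov Model, whereas here a simple moving average and SLA threshold stand in; all numbers are hypothetical.

```python
# Simplified stand-in for the paper's analysis pipeline (the framework
# uses statistical filters plus a Hidden Markov Model; here a moving
# average and a fixed SLA threshold play that role): response times
# gathered by the proxies are smoothed over time, and a sustained rise
# is flagged as QoS degradation. Numbers are hypothetical.
from collections import deque

class DegradationDetector:
    def __init__(self, window=5, agreed_ms=200.0):
        self._samples = deque(maxlen=window)   # sliding-window filter
        self.agreed_ms = agreed_ms             # SLA response-time bound

    def observe(self, response_ms):
        self._samples.append(response_ms)
        smoothed = sum(self._samples) / len(self._samples)
        return smoothed > self.agreed_ms       # True -> trigger repair

detector = DegradationDetector()
trace = [120, 150, 140, 300, 350, 420, 380]    # ms, hypothetical samples
for t, rt in enumerate(trace):
    if detector.observe(rt):
        print(f"sample {t}: degradation detected, handling via proxy")
```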
Abstract:
One objective of the feeder reconfiguration problem in distribution systems is to minimize the power losses for a specific load. The mathematical model of this problem is a nonlinear mixed-integer problem that is generally hard to solve. This paper proposes an algorithm based on artificial neural network theory. In this context, clustering techniques to determine the best training set for a single neural network with generalization ability are also presented. The proposed methodology was employed to solve two electrical systems and presented good results. Moreover, it can be employed for large-scale systems in a real-time environment.
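A conceptual sketch of the clustering-plus-single-network idea on synthetic data; the paper's actual electrical systems and training procedure are not reproduced here.

```python
# Conceptual sketch (synthetic data, not the paper's electrical systems):
# cluster candidate load scenarios and keep the point nearest each
# centroid as a compact training set for a single neural network that
# maps a load profile to a reconfiguration decision.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
loads = rng.uniform(0.2, 1.0, size=(400, 6))        # 6 hypothetical feeder loads
best_config = (loads.sum(axis=1) > 3.6).astype(int) # stand-in "optimal" switch plan

# Pick one representative scenario per cluster for the training set.
km = KMeans(n_clusters=40, n_init=10, random_state=1).fit(loads)
reps = [int(np.argmin(np.linalg.norm(loads - c, axis=1)))
        for c in km.cluster_centers_]

net = MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=1)
net.fit(loads[reps], best_config[reps])
print(f"generalization accuracy: {net.score(loads, best_config):.2f}")
```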
Abstract:
This paper proposes a methodology for integrated planning and design of secondary distribution circuits. The planning model is formulated as a mixed-integer nonlinear programming (MINLP) problem. To solve it, a tabu search (TS) algorithm is used, with a neighborhood structure developed to explore the physical characteristics of the specific geographies included in the planning and expansion of secondary networks, thus obtaining effective solutions with low operating costs and investments. The design stage of the secondary circuits consists of calculating the mechanical stresses to determine the support structures of the primary and secondary distribution systems, and of determining the types of structures to be used in the system according to topological and electrical parameters of the network, thereby accurately assessing the costs involved in the construction and/or reform of secondary systems. A constructive heuristic based on the electrical and topological conditions between the medium-voltage and low-voltage systems is used to connect the primary systems and secondary circuits. Results obtained from planning and design simulations of a real secondary electric energy distribution system are presented.
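A skeleton of the tabu search loop, with a toy 0/1 build-decision objective standing in for the paper's network-cost evaluation and geographic neighborhood structure.

```python
# Skeleton of a tabu search loop (a toy objective stands in for the
# paper's network-cost evaluation; the real neighborhood explores the
# geography of the secondary network).
import random

def tabu_search(initial, neighbors, cost, iters=200, tenure=7):
    current = best = initial
    tabu = []                                  # recently visited solutions
    for _ in range(iters):
        candidates = [n for n in neighbors(current) if n not in tabu]
        if not candidates:
            break
        current = min(candidates, key=cost)    # best admissible neighbor
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)                        # expire the oldest tabu entry
        if cost(current) < cost(best):
            best = current
    return best

# Toy instance: choose a 0/1 build decision per candidate circuit segment.
random.seed(2)
weights = [random.uniform(1, 5) for _ in range(10)]
cost = lambda x: abs(sum(w for w, b in zip(weights, x) if b) - 12.0)
flip = lambda x, i: tuple(b ^ (j == i) for j, b in enumerate(x))
neighbors = lambda x: [flip(x, i) for i in range(len(x))]
print(tabu_search(tuple([0] * 10), neighbors, cost))
```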
Abstract:
Background: Leptospirosis is an important zoonotic disease associated with poor areas of urban settings in developing countries, where early diagnosis and prompt treatment may prevent disease. Although rodents are considered the main reservoirs of leptospirosis, dogs may develop the disease, may become asymptomatic carriers, and may be used as sentinels for disease epidemiology. The use of Geographical Information Systems (GIS) combined with spatial analysis techniques allows the mapping of the disease and the identification and assessment of health risk factors. Besides GIS and spatial analysis, the data mining technique of decision trees has great potential to find patterns in the behavior of the variables that determine the occurrence of leptospirosis. The objective of the present study was to apply Geographical Information Systems and data mining (decision tree) to evaluate the risk factors for canine leptospirosis in an area of Curitiba, PR.
Materials, Methods & Results: The present study was performed in Vila Pantanal, a poor urban community in the city of Curitiba. A total of 287 dog blood samples were randomly obtained house-by-house in a two-day sampling in January 2010, and a questionnaire was applied to the owners at the time of sampling. Geographical coordinates of each household with a tested dog were obtained using a Global Positioning System (GPS) receiver, for mapping the spatial distribution of dogs reagent and non-reagent to leptospirosis. For the decision tree, risk factors included the results of the microagglutination test (MAT) on dog serum, previous disease in the household, contact with rats or other dogs, dog breed, outdoor access, feeding, trash around the house or backyard, open sewer proximity and flooding. A total of 189 samples (about 2/3 of all samples) were randomly selected for the training file and the consequent decision rules; the remaining 98 samples were used for the testing file. The seroprevalence showed a spatial distribution covering the whole Pantanal area, without clusters of reagent animals. Regarding the data mining, of the 189 samples used in the decision tree, a total of 165 (87.3%) were correctly classified, giving a Kappa index of 0.413. A total of 154 out of 159 (96.8%) non-reagent samples were correctly classified, and only 5/159 (3.2%) were wrongly identified. On the other hand, only 11 (36.7%) reagent samples were correctly classified, with 19 (63.3%) misclassified.
Discussion: The spatial distribution covering the whole Pantanal area showed that all animals in the area are at risk of infection by Leptospira spp. Although most samples were classified correctly by the decision tree, a degree of difficulty in separability was observed for the seropositive animals, with only 36.7% of those samples classified correctly. This may occur because the number of seronegative animals greatly exceeds the number of seropositive ones, skewing the patterns of variable behavior. The data mining helped to evaluate the most important risk factors for leptospirosis in a poor urban community of Curitiba. The variables selected by the decision tree reflected important factors in the occurrence of the disease (lack of sewers, presence of rats and rubbish, and dogs with free access to the street). The analyses showed the multifactorial character of the epidemiology of canine leptospirosis.
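A sketch mirroring the study's experimental setup on synthetic stand-in data, not the real serology records: a decision tree trained on 189 of 287 samples and scored on the held-out 98 with Cohen's kappa. The risk-factor features here are hypothetical binaries.

```python
# Sketch mirroring the study's setup on synthetic stand-in data (the
# real work used 287 dog serology records with questionnaire risk
# factors): train a decision tree on roughly two thirds of the samples
# and score the held-out third with Cohen's kappa.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(3)
n = 287
# Hypothetical binary risk factors: open sewer, rats seen, street access.
X = rng.integers(0, 2, size=(n, 3))
risk = 0.05 + 0.15 * X[:, 0] + 0.10 * X[:, 1] + 0.10 * X[:, 2]
y = (rng.random(n) < risk).astype(int)         # 1 = seroreagent

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=189, random_state=3)
tree = DecisionTreeClassifier(max_depth=3, random_state=3).fit(X_tr, y_tr)
print(f"kappa on held-out samples: {cohen_kappa_score(y_te, tree.predict(X_te)):.2f}")
```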