925 results for Automation and robotics
Abstract:
Testing complex software is time-consuming. Plenty of automated tools are available for desktop applications, but embedded systems usually require a custom-made tool. Because building a complete test framework is a complicated task, the test platform was built on top of an already existing tool, CANoe. CANoe is a tool for CAN bus analysis and node simulation. The functionality of CANoe was extended with a LabVIEW DLL; the LabVIEW software was used to simulate the hardware components of the embedded device. As a result of the study, a platform was created on which tests could be automated. Of the current test plan, 10 percent was automated, and up to 60 percent could be automated with the current functionality.
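The abstract gives no implementation details, so the following Python sketch only illustrates the general pattern it describes: a test script driving a hardware-simulation library. The library name hwsim.dll and its exported functions set_sensor_value and read_actuator_state are invented for the example, not taken from the thesis.

```python
import ctypes

# Hypothetical hardware-simulation DLL (e.g., built from LabVIEW);
# the name and function signatures below are assumptions.
hwsim = ctypes.CDLL("hwsim.dll")
hwsim.set_sensor_value.argtypes = [ctypes.c_int, ctypes.c_double]
hwsim.read_actuator_state.argtypes = [ctypes.c_int]
hwsim.read_actuator_state.restype = ctypes.c_double

def test_overtemperature_shutdown():
    """Drive a simulated temperature past its limit and check the response."""
    TEMP_CHANNEL, FAN_CHANNEL = 0, 1
    hwsim.set_sensor_value(TEMP_CHANNEL, 95.0)            # degrees C, above limit
    assert hwsim.read_actuator_state(FAN_CHANNEL) == 1.0  # fan switched on
```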
Abstract:
Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web; hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters that a user provides via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces give web users online access to myriads of databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases.

Characterizing the deep Web: Though the term deep Web was coined in 2000, which is a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web existing so far are predominantly based on the study of deep web sites in English. One can then expect that findings from these surveys may be biased, especially owing to the steady increase in non-English web content. Surveying national segments of the deep Web is therefore of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web.

Finding deep web resources: The deep Web has been growing at a very fast pace, and it has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that the search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.

Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. Thus, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts, and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
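The thesis's own data model and form query language are not reproduced in the abstract. As a generic illustration of the task it automates (submitting a web search form and scraping the dynamic result pages), here is a minimal Python sketch; the URL, form field names, and result-page markup are all invented for the example.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical search interface; nothing below is from the thesis itself.
FORM_URL = "https://example.org/books/search"

def query_web_database(title_term: str, max_price: float) -> list[dict]:
    """Submit a search form and extract structured data from result pages."""
    response = requests.get(FORM_URL, params={"title": title_term,
                                              "price_max": max_price})
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    results = []
    for row in soup.select("div.result"):          # assumed result container
        results.append({
            "title": row.select_one("span.title").get_text(strip=True),
            "price": row.select_one("span.price").get_text(strip=True),
        })
    return results

print(query_web_database("robotics", 50.0))
```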
Abstract:
Rapid manufacturing is an advanced manufacturing technology based on layer-by-layer fabrication of a part. This paper presents experimental work carried out to investigate the effects of scan speed, layer thickness, and build direction on the following part features: dimensional error, surface roughness, and mechanical properties for DMLS with DS H20 powder and SLM with CL 20 powder (1.4404/AISI 316L). The findings were evaluated using analysis of variance (ANOVA). According to the experimental results, build direction has a significant effect on part quality in terms of dimensional error and surface roughness. For the SLM process, the build direction has no influence on mechanical properties. The results of this research help industry estimate part quality and mechanical properties before producing parts by additive manufacturing with iron-based powders.
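As a sketch of the kind of evaluation the paper names (a one-way ANOVA of a part feature against build direction), the following Python snippet uses made-up surface roughness values for three hypothetical build directions; the real data and factor levels are in the paper.

```python
from scipy import stats

# Invented surface roughness samples (micrometres), grouped by an
# assumed build direction of 0, 45, and 90 degrees.
roughness_0deg  = [8.1, 7.9, 8.4, 8.0]
roughness_45deg = [10.2, 9.8, 10.5, 10.1]
roughness_90deg = [12.3, 12.0, 12.8, 12.5]

f_stat, p_value = stats.f_oneway(roughness_0deg, roughness_45deg,
                                 roughness_90deg)
# A small p-value (e.g., < 0.05) would indicate that build direction
# has a statistically significant effect on surface roughness.
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```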
Abstract:
The target of this thesis is to find out the potential of automation maintenance services in Russian industry, especially in the St. Petersburg region. At the beginning of the study, industrial maintainability and process efficiency are discussed from the point of view of process automation. A survey of the present technology and maintenance methods was conducted during five visits to local plants. The results of the interviews are analyzed numerically to clarify the common needs and the potential of automation maintenance services. The most interesting services are evaluated in terms of the resources they require in order to find economically justified solutions for the needs of the industry. As a result of this study, some service products that would interest the interviewed companies are introduced. These could be offered to the industry to enhance the cost-efficiency and productivity of processes.
Abstract:
Simultaneous localization and mapping (SLAM) is a very important problem in mobile robotics. Many solutions have been proposed by different scientists during the last two decades; nevertheless, few studies have considered the use of multiple sensors simultaneously. The solution pursued here combines several data sources with the aid of an Extended Kalman Filter (EKF). Two approaches are proposed. The first is to run the ordinary EKF SLAM algorithm for each data source separately, in parallel, and then, at the end of each step, fuse the results into one solution. The second is to use multiple data sources simultaneously in a single filter. A comparison of the computational complexity of the two methods is also presented: the first method is almost four times faster than the second.
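As a minimal sketch of the second approach (one filter consuming several sensors), the Python snippet below applies the standard EKF measurement update sequentially for two measurements in a single filter. The state, measurement models, and noise levels are invented for illustration, and a linear measurement matrix is used for brevity.

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Standard (E)KF measurement update, linear H for brevity."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

x = np.zeros(3)                        # robot pose [x, y, heading], assumed
P = np.eye(3)

# Two assumed position-like measurements, e.g., from odometry and a
# laser landmark observation, fused one after the other in one filter:
for z, H, R in [
    (np.array([1.02, 0.48]), np.eye(2, 3), 0.10 * np.eye(2)),
    (np.array([0.98, 0.52]), np.eye(2, 3), 0.05 * np.eye(2)),
]:
    x, P = ekf_update(x, P, z, H, R)

print(x, np.diag(P))
```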
Abstract:
Coverage Path Planning (CPP) is the task of determining a path that passes over all points of an area or volume of interest while avoiding obstacles. This task is integral to many robotic applications, such as vacuum cleaning robots, painter robots, autonomous underwater vehicles creating image mosaics, demining robots, lawn mowers, automated harvesters, window cleaners, and the inspection of complex structures, to name just a few. A considerable body of research has addressed the CPP problem; however, no updated survey on CPP reflecting recent advances in the field has been presented in the past ten years. In this paper, we present a review of the most successful CPP methods, focusing on the achievements made in the past decade. Furthermore, we discuss reported field applications of the described CPP methods. This work aims to become a starting point for researchers who are initiating their endeavors in CPP, and likewise to present a comprehensive review of the recent breakthroughs in the field, providing links to the most interesting and successful works.
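For readers new to CPP, here is a toy Python sketch of the simplest coverage pattern, a boustrophedon (back-and-forth) sweep over a rectangular, obstacle-free grid. Real CPP methods surveyed in the paper handle obstacles and cell decomposition; this only shows the coverage idea itself.

```python
def boustrophedon_path(width: int, height: int) -> list[tuple[int, int]]:
    """Visit every cell of a width x height grid in a lawnmower pattern."""
    path = []
    for y in range(height):
        # Sweep left-to-right on even rows, right-to-left on odd rows.
        row = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        path.extend((x, y) for x in row)
    return path

path = boustrophedon_path(4, 3)
assert len(set(path)) == 4 * 3          # every cell covered exactly once
print(path)
```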
Abstract:
The central goal of food safety policy in the European Union (EU) is to protect consumer health by guaranteeing a high level of food safety throughout the food chain. This goal can in part be achieved by testing foodstuffs for the presence of various chemical and biological hazards. The aim of this study was to facilitate food safety testing by providing rapid and user-friendly methods for the detection of particular food-related hazards. Heterogeneous competitive time-resolved fluoroimmunoassays were developed for the detection of selected veterinary residues, that is, coccidiostat residues, in eggs and chicken liver. After a simplified sample preparation procedure, the immunoassays were performed either in a manual format with dissociation-enhanced measurement or in an automated format with pre-dried assay reagents and surface measurement. Although the assays were primarily designed for screening purposes, providing only qualitative results, they could also be used in a quantitative mode. All the developed assays had good performance characteristics, enabling reliable screening of samples at the concentration levels required by the authorities. A novel polymerase chain reaction (PCR)-based assay system was developed for the detection of Salmonella spp. in food. The sample preparation included a short non-selective pre-enrichment step, after which the target cells were collected with immunomagnetic beads and applied to PCR reaction vessels containing all the reagents required for the assay in dry form. The homogeneous PCR assay was performed with a novel instrument platform, GenomEra™, and the qualitative assay results were automatically interpreted based on end-point time-resolved fluorescence measurements and cut-off values. The assay was validated using various food matrices spiked with sub-lethally injured Salmonella cells at levels of 1-10 colony-forming units (CFU)/25 g of food. The main advantage of the system was the exceptionally short time to result: the entire process, starting from the pre-enrichment and ending with the PCR result, could be completed in eight hours. In conclusion, molecular methods using state-of-the-art assay techniques were developed for food safety testing. The combination of time-resolved fluorescence detection and ready-to-use reagents enabled sensitive assays easily amenable to automation. Consequently, together with the simplified sample preparation, these methods could prove applicable in routine testing.
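The automated interpretation step mentioned above (classifying an end-point fluorescence signal against a cut-off) can be sketched in a few lines of Python; the threshold and signal values here are placeholders, not the validated cut-offs from the study.

```python
# Assumed signal threshold in arbitrary fluorescence counts.
CUTOFF_COUNTS = 1500.0

def interpret(signal_counts: float) -> str:
    """Classify an end-point time-resolved fluorescence measurement."""
    return ("Salmonella detected" if signal_counts >= CUTOFF_COUNTS
            else "negative")

for sample, signal in {"A": 420.0, "B": 3175.0}.items():
    print(sample, interpret(signal))
```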
Abstract:
Workflow management systems aim at the controlled execution of complex application processes in distributed and heterogeneous environments. These systems will shape the structure of information systems in business and non-business environments. E-business and system integration are fertile soil for workflow (WF) and groupware tools. This thesis studies WF and groupware tools in order to gather in-house knowledge of WF, to better utilize WF solutions in the future, and focuses on SAP Business Workflow in order to find a global solution for Application Link Enabling (ALE) support for system integration. Piloting this solution at Nokia gathers experience with the SAP R/3 WF tool for future development projects. The literature part of this study guides the reader into the world of business process automation, providing a general description of the history, use, and potential of WF and groupware software. The empirical part begins with the background of the case study, describing the IT environment that initiated the case: the introduction of SAP R/3 at Nokia, the communication technique in use, and the WF tool. The case study focuses on one solution built with SAP Business Workflow. This study provides a concept for monitoring communication between ERP systems and for increasing the quality of system integration. The case study describes a way to create a support model for ALE/EDI interfaces. The support model includes the monitoring organization and the workflow processes for solving the most common IDoc-related errors.
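The thesis's actual support model lives inside SAP Business Workflow; as a language-neutral illustration of the monitoring idea (scan IDoc status records and route failures to a support organization), here is a hedged Python sketch. The record layout and routing rule are invented; statuses 51 ("application document not posted") and 53 ("application document posted") are standard SAP IDoc status codes.

```python
IDOC_OK = "53"         # application document posted
IDOC_APP_ERROR = "51"  # application document not posted

def route_idocs(idocs: list[dict]) -> list[dict]:
    """Return the IDocs that need attention from the support organization."""
    return [doc for doc in idocs if doc["status"] == IDOC_APP_ERROR]

inbound = [
    {"number": "0000000042", "status": IDOC_OK},
    {"number": "0000000043", "status": IDOC_APP_ERROR,
     "message": "Material 4711 does not exist"},
]
for doc in route_idocs(inbound):
    print(f"escalate IDoc {doc['number']}: {doc.get('message', '')}")
```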
Abstract:
Software testing is one of the essential parts of the software engineering process. The objective of the study was to describe software testing tools and their use. The thesis contains examples of software testing tool usage. The study was conducted as a literature study, with focus on current software testing practices and quality assurance standards. In the paper, a tool classification scheme was employed, and the testing tools presented in the study were classified according to it. We found that it is difficult to categorize currently available tools by a single testing activity, as many of them contain functionality that exceeds the scope of a single testing type.
Abstract:
Robot programming is time-consuming and requires an operator familiar with robot programming to act as the robot's teacher. To make a robot cell cost-effective, the operator should preferably manage several cells at the same time. This is not a major problem for large companies, which may have dozens of robot cells. For a small or medium-sized enterprise, however, an automation investment may be left unmade because of the difficulty of programming. This master's thesis focused on studying robotization from the perspective of small and medium-sized enterprises. The theoretical part concentrates on the basic knowledge needed for robot cell design: robot structure, control system, programming, and safety. In addition to these basics, welding automation and the technical concept of a lead-through programmable robot cell are considered. The discussion of the lead-through programmable robot cell concept also covers the components required for lead-through programming, such as a force/torque sensor. Robot cell design must comply with the requirements of the Machinery Decree. The safety section discusses the requirements that the Machinery Decree places on machine design, and the practical part covers the design of Winnova's lead-through programmable robot cell in accordance with the decree's guidelines. The practical part also examines the advantages of lead-through programming over other programming methods and presents investment calculations for a lead-through programmable and a teach-pendant programmable robot cell. The test results show that lead-through programming is a faster and simpler way to program a robot than teach programming. The comparison of the investment calculations shows that lead-through programming is the more economical alternative thanks to its lower operating costs.
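A common way to realize lead-through guidance with a wrist-mounted force/torque sensor is a simple admittance law that maps the operator's push into a Cartesian velocity. The Python sketch below is a conceptual illustration only; the gain, dead band, speed limit, and sensor interface are invented, not taken from the thesis.

```python
import numpy as np

ADMITTANCE_GAIN = 0.002   # (m/s) per N, assumed
DEADBAND_N = 2.0          # ignore sensor noise below this force
V_MAX = 0.25              # m/s, assumed safety limit on guided motion

def guiding_velocity(force_xyz: np.ndarray) -> np.ndarray:
    """Map a measured force vector [N] to a hand-guiding velocity [m/s]."""
    if np.linalg.norm(force_xyz) < DEADBAND_N:
        return np.zeros(3)
    v = ADMITTANCE_GAIN * force_xyz
    speed = np.linalg.norm(v)
    return v if speed <= V_MAX else v * (V_MAX / speed)

print(guiding_velocity(np.array([30.0, 0.0, -10.0])))  # operator push
```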
Abstract:
Electricity distribution network operation (NO) models are being challenged, as they are expected to continue to undergo changes during the coming decades in the fairly developed and regulated Nordic electricity market. Network asset managers must adapt to competitive techno-economic business models for the operation of increasingly intelligent distribution networks. Factors driving the change toward new business models within network operation include increased investments in distribution automation (DA), regulatory frameworks for annual profit limits and quality through outage costs, increasing end-customer demands, climatic changes, and the increasing use of data system tools such as the Distribution Management System (DMS). The doctoral thesis addresses the questions of a) whether conditions and qualifications exist for competitive markets within electricity distribution network operation and, b) if so, the identification of limitations and required business mechanisms. The thesis aims to provide an analytical business framework, primarily for electric utilities, for evaluating and developing dedicated network operation models to meet future market dynamics within network operation. In the thesis, the generic build-up of a business model is addressed through the strategic business hierarchy levels of mission, vision, and strategy for defining the strategic direction of the business, followed by the planning, management, and process execution levels of enterprise strategy execution. Research questions within electricity distribution network operation are addressed at the specified hierarchy levels. The results of the research represent interdisciplinary findings in the areas of electrical engineering and production economics. The main scientific contributions include further development of extended transaction cost economics (TCE) for governance decisions within electricity networks and validation of the usability of the methodology for the electricity distribution industry. Moreover, the DMS benefit evaluations in the thesis, based on outage cost calculations, propose theoretical maximum benefits of DMS applications equalling roughly 25% of the annual outage costs and 10% of the respective operative costs in the electric utility studied. Hence, the annual measurable theoretical benefits from the use of DMS applications are considerable. The theoretical results of the thesis are validated by surveys and questionnaires.
Abstract:
This study presents the information required to describe the machine and device resources in the turret punch press environment, which are needed for the development of an analysis method for automated production. The description of product and device resources and their interconnectedness is the starting point for method comparison, cost development, production planning, and optimisation. The manufacturing method cannot be optimised unless the variables and their interdependence are known. Sheet metal parts in particular may become remarkably complex, and their automatic manufacture may be difficult or, with some automatic equipment, even impossible if the manufacturing properties are not known. This thesis consists of three main elements, which constitute a triangulation. In the first phase of the triangulation, manufacturing on a turret punch press is examined in order to find the factors that affect the efficiency of production. In the second phase, the manufacturability of products on turret punch presses is examined through a set of laboratory tests. The third phase involves an examination of five industrial parts. The key findings of this study are as follows. The full efficiency of machining at a high level of automation cannot be achieved unless the raw materials used in production and the dependencies between the machine and its tools are well known. Machine-specific manufacturability factors for turret punch presses were not taken into account in the industrial case samples. On the grounds of the performed tests and the industrial case samples, the designer of a sheet metal product can directly influence the machining time, material loss, energy consumption, and number of tools required on a turret punch press by making decisions in the way presented in the hypothesis of this study; a toy model of this dependence is sketched below. The sheet metal parts to be produced can be optimised for manufacture on a turret punch press when the material to be used and the available machine and tool options are known. This requires in-depth, machine- and tool-specific knowledge of the machine and tool properties. None of the optimisation starting points described here is a separate entity; instead, they are all connected to each other.
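As a rough, purely illustrative Python model of how design decisions drive machining time on a turret punch press, the sketch below trades the number of punch hits against the number of distinct tools (each turret tool change costs time). The hit rate and change time are invented values, not measurements from the thesis.

```python
HIT_RATE_HZ = 5.0       # punch hits per second, assumed
TOOL_CHANGE_S = 3.0     # seconds per turret tool change, assumed

def machining_time_s(hits: int, distinct_tools: int) -> float:
    """Toy estimate: punching time plus time lost to tool changes."""
    return hits / HIT_RATE_HZ + distinct_tools * TOOL_CHANGE_S

# A design nibbled with few standard tools vs. one using more special
# tools that need fewer hits; the faster choice depends on both terms.
print(machining_time_s(hits=420, distinct_tools=4))
print(machining_time_s(hits=180, distinct_tools=9))
```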