914 results for PLC and SCADA programming
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation, used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law, and these developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and the related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and the customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels formerly occupied by TV broadcasting, have gradually started to change the rigid world of Machine-to-Machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition among the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix and Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and it will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
Possibilistic Defeasible Logic Programming (P-DeLP) is a logic programming language which combines features from argumentation theory and logic programming, incorporating the treatment of possibilistic uncertainty at the object-language level. In spite of its expressive power, an important limitation in P-DeLP is that imprecise, fuzzy information cannot be expressed in the object language. One interesting alternative for solving this limitation is the use of PGL+, a possibilistic logic over Gödel logic extended with fuzzy constants. Fuzzy constants in PGL+ allow expressing disjunctive information about the unknown value of a variable, in the sense of a magnitude, modelled as a (unary) predicate. The aim of this article is twofold: firstly, we formalize DePGL+, a possibilistic defeasible logic programming language that extends P-DeLP through the use of PGL+ in order to incorporate fuzzy constants and a fuzzy unification mechanism for them. Secondly, we propose a way to handle conflicting arguments in the context of the extended framework.
Abstract:
The aim of this work was to assess and identify the factors affecting the supplier relationship in the cooperation between JOT Automation Group Oyj and its subcontractors, and to establish a supplier evaluation process that improves the company's competitiveness. The work focused on suppliers operating in the general material and component markets for the manufacture of production systems for the electronics industry. First, the literature on supplier relationships and their evaluation was reviewed. To support the theory, interviews were conducted and the primary needs and objectives for the evaluation process were mapped out. The completed process was tested in practice with two case examples. The process took the form of a whole divided into two tools: an audit, which assesses the supplier's capability to meet the requirements set for it, and supplier performance measurement, which continuously tests and compares the actual level of operations against the results obtained in the audit. The work includes a description of, and instructions for, the use of the supplier evaluation process. Using the process reduces the supplier-related risks associated with material availability and procurement. Experience from the case examples showed that the process makes it possible to dig into the important core areas and develop them in a way that benefits both the supplier and the buying company. The supplier evaluation process develops into a practice for maintaining the relationship between the company and its supplier.
Abstract:
This thesis presents software that allows data acquisition from a production process, in this case an automatic pallet nailing line. Recording these data makes it possible to track and analyze them later, either with the application's analytical tools or by transferring the data to an Excel sheet or a database. The application for the PLC that controls the nailing line was developed in Ladder. Control pages were developed for the HMI application that monitors the process. Finally, Visual Basic was used for the production department's computer application. To extract production variables from the process, the developed software communicates, using the Modbus TCP/IP protocol, over the network formed by the PLC and the HMI terminal, which store and control the process.
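As a purely illustrative sketch of the kind of Modbus TCP exchange described above, the following Python code builds a read-holding-registers request by hand over a raw socket; the host address, unit id and register block are hypothetical and not taken from the thesis.

```python
# Minimal sketch of reading Modbus TCP holding registers over a raw socket.
# Host, port, unit id and register addresses are placeholders.
import socket
import struct

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed before full response")
        buf += chunk
    return buf

def read_holding_registers(host, start_addr, count, unit_id=1, port=502):
    """Read `count` holding registers starting at `start_addr` (function code 3)."""
    transaction_id = 1
    # MBAP header: transaction id, protocol id (0), length, unit id; then the PDU.
    request = struct.pack(
        ">HHHB BHH",
        transaction_id, 0, 6, unit_id,   # length = unit id (1) + PDU (5) = 6 bytes
        3, start_addr, count             # PDU: function code 3, address, quantity
    )
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(request)
        header = _recv_exact(sock, 9)    # MBAP (7) + function code + byte count
        byte_count = header[8]
        payload = _recv_exact(sock, byte_count)
        # Each register is a big-endian 16-bit value.
        return list(struct.unpack(">" + "H" * count, payload))

if __name__ == "__main__":
    # Hypothetical PLC address and register block holding production counters.
    print(read_holding_registers("192.168.0.10", start_addr=0, count=4))
```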
Abstract:
The Escola Politècnica Superior of the Universitat de Vic has a FESTO didactic manufacturing cell that simulates the assembly process of an order. This cell is made up of four distinct stations that can work independently or together: the pallet station, the plate station, the conveyor station and the warehouse station. Each station is a set of sensors and actuators controlled by a PLC, and these are interconnected through an industrial bus. The objective of this work is to replace the PLCs, decide how the stations should operate, install a touch screen for process control, and program all the elements. The project was carried out in the following main phases: 1. Study and familiarization with the stations; in this phase the different sensors and actuators that make them up were studied, as well as how they operated with the old program and PLCs. 2. Installation and wiring of the new PLCs and the touch screen. 3. Study of the new operation the stations should follow. 4. Programming of the new devices according to the agreed operation. 5. Commissioning of the system and testing. 6. Writing of the project report, which explains the characteristics and operation of all the stations and the touch screen. The conclusion drawn from this work is that, although automating a manufacturing process requires an initial investment of resources, once the installation is complete it improves the efficiency of the system. This is why industry increasingly tends to automate its processes, not only to improve competitiveness but also to carry out tasks that people cannot perform efficiently or safely.
Abstract:
Software integration is a stage in a software development process in which separate components are assembled into a single product. It is important to manage the risks involved and to be able to integrate smoothly, because software cannot be released without integrating it first. Furthermore, it has been shown that the integration and testing phase can make up 40 % of overall project costs. These issues can be mitigated by using a software engineering practice called continuous integration. This thesis work presents how continuous integration was introduced at the author's employer organisation. This includes studying how the continuous integration process works and creating the technical basis for using the process on future projects. The implemented system supports software written in the C and C++ programming languages on the Linux platform, but the general concepts can be applied to any programming language and platform by selecting the appropriate tools. The results demonstrate in detail what issues need to be solved when the process is adopted in a corporate environment. Additionally, they provide an implementation and a process description suited to the organisation. The results show that continuous integration can reduce the risks involved in a software process and increase the quality of the product as well.
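Conceptually, the continuous integration cycle described above amounts to fetching the latest sources, building them and running the test suite, then reporting the result. The Python sketch below illustrates that loop for a make-based C/C++ project; the repository URL and the build and test targets are placeholders, not the tooling actually used in the thesis.

```python
# Illustrative skeleton of a continuous integration cycle for a C/C++ project:
# fetch the latest sources, build, run the tests, and report the outcome.
import subprocess
import time

REPO = "https://example.com/project.git"   # hypothetical repository
WORKDIR = "ci_workspace"
STEPS = [
    ["make", "clean"],
    ["make", "all"],
    ["make", "test"],   # assumed test target; real projects vary
]

def run(cmd, cwd=None):
    result = subprocess.run(cmd, cwd=cwd, capture_output=True, text=True)
    return result.returncode == 0, result.stdout + result.stderr

def ci_cycle():
    # Clone on the first run, otherwise pull the latest changes.
    ok, _ = run(["git", "clone", REPO, WORKDIR])
    if not ok:
        run(["git", "-C", WORKDIR, "pull"])
    for step in STEPS:
        ok, log = run(step, cwd=WORKDIR)
        if not ok:
            print("FAILED:", " ".join(step))
            print(log)
            return False
    print("Build and tests passed")
    return True

if __name__ == "__main__":
    while True:             # a real CI server would trigger on commits instead
        ci_cycle()
        time.sleep(15 * 60)
```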
Abstract:
Agile coaching of a project team is one way to aid learning of agile methods. The objective of this thesis is to present an agile coaching plan and to follow how complying with the plan affects the project teams. Furthermore, how the agile methods work in the projects is followed. Two projects are used to support the research. From the thesis point of view, the task in the first project is to coach the project team and two new coaches. The task in the second project is also to coach the project team, but this time so that one of the new coaches acts as the coach. The agile methods used by the projects are the Scrum process and Extreme Programming. In the latter, test-driven development, continuous integration and pair programming are examined more closely. The results of the work are based on observations from the projects and the analysis derived from those observations. The results are divided into the effects of the coaching and the functionality of the agile methods in the projects. Because of the small sample set, the results are only indicative. The presented plan for coaching the agile methods needs further development, but the results on the functionality of the agile methods are encouraging.
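As a generic illustration of the test-driven development practice mentioned above (not code from the coached projects), a test is written first and the simplest implementation that makes it pass is added afterwards:

```python
# Step 1: write the test first; it fails because fizzbuzz() does not exist yet.
def test_fizzbuzz():
    assert fizzbuzz(3) == "Fizz"
    assert fizzbuzz(5) == "Buzz"
    assert fizzbuzz(15) == "FizzBuzz"
    assert fizzbuzz(7) == "7"

# Step 2: write the simplest implementation that makes the test pass,
# then refactor while keeping the test green.
def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

if __name__ == "__main__":
    test_fizzbuzz()
    print("all assertions passed")
```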
Abstract:
With the growth of new technologies, using online tools has become part of everyday life. This has a large impact on researchers, as the data obtained from various experiments need to be analyzed, and knowledge of programming has become mandatory even for pure biologists. Hence, VTT came up with a new tool, R Executables (REX), a web application designed to provide a graphical interface for biological data functions such as image analysis, gene expression data analysis, plotting, and disease and control studies, which employs R functions to produce the results. REX provides an interactive application in which biologists can directly enter values and run the required analysis with a single click. The program processes the given data in the background and prints results rapidly. Due to the growth of data and the load on the server, the interface has developed problems concerning time consumption, a poor GUI, data storage issues, security, a minimally interactive user experience, and crashes with large amounts of data. This thesis covers the methods by which these problems were resolved to make REX a better application for the future. The old REX was developed using Python Django; now a new framework, Vaadin, has been adopted. Vaadin is a Java framework for developing web applications, and its programming model is very similar to plain Java with new rich components. Vaadin provides better security, better speed, and a good, interactive interface. In this thesis, a subset of REX functionalities, including IST bulk plotting and image segmentation, was selected and implemented using Vaadin. I programmed 662 lines of code, with Vaadin as the front-end handler, while the R language was used for back-end data retrieval, computing and plotting. The application is designed to allow further functionalities to be migrated with ease from the old REX. Future development is focused on including high-throughput screening functions along with gene expression database handling.
Abstract:
Milk supply from Mexican dairy farms does not meet demand and small-scale farms can contribute toward closing the gap. Two multi-criteria programming techniques, goal programming and compromise programming, were used in a study of small-scale dairy farms in central Mexico. To build the goal and compromise programming models, 4 ordinary linear programming models were also developed, which had objective functions to maximize metabolizable energy for milk production, to maximize margin of income over feed costs, to maximize metabolizable protein for milk production, and to minimize purchased feedstuffs. Neither multicriteria approach was significantly better than the other; however, by applying both models it was possible to perform a more comprehensive analysis of these small-scale dairy systems. The multi-criteria programming models affirm findings from previous work and suggest that a forage strategy based on alfalfa, rye-grass, and corn silage would meet nutrient requirements of the herd. Both models suggested that there is an economic advantage in rescheduling the calving season to the second and third calendar quarters to better synchronize higher demand for nutrients with the period of high forage availability.
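As a purely illustrative sketch of a goal-programming formulation of the kind referred to above, the following Python code minimizes the weighted under-achievement of an energy goal and an income goal subject to a land constraint using scipy's linprog; every coefficient and target is invented for the example and is unrelated to the data of the study.

```python
# Hedged, illustrative goal-programming sketch. All coefficients (energy
# yields, margins, targets, land area) are made-up numbers, not study data.
from scipy.optimize import linprog

# Decision variables: hectares of alfalfa, ryegrass, corn silage,
# followed by under/over-achievement deviations for two goals:
#   x = [a, r, c, dE_minus, dE_plus, dI_minus, dI_plus]
energy_per_ha = [95.0, 80.0, 120.0]     # GJ ME per hectare (hypothetical)
income_per_ha = [700.0, 550.0, 900.0]   # income over feed cost (hypothetical)
energy_goal, income_goal = 900.0, 6500.0
land_available = 10.0                    # hectares (hypothetical)

# Minimize weighted under-achievement of both goals.
c = [0, 0, 0, 1.0, 0, 0.5, 0]

# Goal constraints: attainment + d_minus - d_plus = goal
A_eq = [
    energy_per_ha + [1, -1, 0, 0],
    income_per_ha + [0, 0, 1, -1],
]
b_eq = [energy_goal, income_goal]

# Resource constraint: total forage area cannot exceed the land available.
A_ub = [[1, 1, 1, 0, 0, 0, 0]]
b_ub = [land_available]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 7, method="highs")
print("forage areas (ha):", res.x[:3])
print("unmet energy / income goals:", res.x[3], res.x[5])
```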
Abstract:
In 1989, the computer programming language POP-11 is 21 years old. This book looks at the reasons behind its invention and traces its rise from an experimental language to a major AI language that has played an important part in many innovative projects. There is a chapter on the inventor of the language, Robin Popplestone, and a discussion of the applications of POP-11 in a variety of areas. The efficiency of AI programming is covered, along with a comparison between POP-11 and other programming languages. The book concludes by reviewing the standardization of POP-11 into POP91.
Abstract:
There have been various techniques published for optimizing the net present value of tenders by use of discounted cash flow theory and linear programming. These approaches to tendering appear to have been largely ignored by the industry. This paper utilises six case studies of tendering practice in order to establish the reasons for this apparent disregard. Tendering is demonstrated to be a market-orientated function in which many subjective judgements are made regarding a firm's environment. Detailed consideration of 'internal' factors such as cash flow is therefore judged to be unjustified. Systems theory is then drawn upon and applied to the separate processes of estimating and tendering. Estimating is seen as taking place in a relatively sheltered environment and as such operates as a relatively closed system. Tendering, however, takes place in a changing and dynamic environment and as such must operate as a relatively open system. The use of sophisticated methods to optimize the value of tenders is then identified as being dependent upon the assumption of rationality, which is justified in the case of a relatively closed system (i.e. estimating), but not for a relatively open system (i.e. tendering).
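For reference, the net present value at the heart of the discounted cash flow approach mentioned above is just the sum of cash flows discounted back to the present; a minimal Python illustration with arbitrary figures:

```python
# Net present value of a stream of cash flows, the quantity the tendering
# models referred to above try to optimize. Figures are arbitrary examples.
def npv(rate, cash_flows):
    """cash_flows[t] is the net cash flow at the end of period t (t = 0, 1, ...)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

if __name__ == "__main__":
    # A tender paid in stages: initial outlay followed by staged receipts.
    flows = [-100_000, 30_000, 40_000, 50_000, 20_000]
    print(round(npv(0.08, flows), 2))
```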
Abstract:
During petroleum well production it is common for oil and water to be produced simultaneously, in proportions that can vary from 0% up to values close to 100% water. Moreover, production flow rates can vary widely, depending on the characteristics of each reservoir. Therefore, the meters used in the field for flow and BSW (water content in the oil) measurement must work well over wide operating ranges. To evaluate the operation of these meters under different operating conditions, a laboratory will be built at UFRN whose objective is to evaluate automatically the processes of petroleum flow and BSW measurement. The good performance of these meters is fundamental for the accuracy of the measured volumes of produced liquid and crude petroleum. For the measurement of this production, petroleum companies use meters that should indicate the values with the largest possible accuracy and respect a series of conditions and minimum requirements established by the joint ANP/INMETRO ordinance 19106/2000. The Laboratory for the Evaluation of Flow and BSW Measurement Processes to be built will comprise an oil tank, a water tank, a mixer, a calibration tank, a separation tank and a residue tank for discarding fluids, all fundamental for the evaluation of flow and BSW meters. The whole process will be automated through the use of a Programmable Logic Controller (PLC) and a supervisory system. Besides allowing the evaluation of the flow and BSW meters used by petroleum companies, this laboratory will make possible the development of research related to automation. In addition, it will contribute to the development of the Computer Engineering and Automation Department, supporting the progress of faculty and students and qualifying them for a job market in continuous growth. The present work describes the automation project of the laboratory to be built at UFRN. The system will be automated using a Programmable Logic Controller and a supervisory system. The PLC programming and the supervisory system screens were developed in this work.
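As a rough illustration of the supervisory layer described above, the Python sketch below runs a simple scan cycle that reads a few process tags and flags alarm conditions; the tag names, limits and simulated readings are invented and are not part of the laboratory design.

```python
# Illustrative supervisory-style polling loop for an automated measurement
# process. Tag names, alarm limits and the simulated read are placeholders;
# a real supervisory system would read these values from the PLC.
import random
import time

TAGS = {
    "oil_tank_level_pct":   (5.0, 95.0),   # (low alarm, high alarm)
    "water_tank_level_pct": (5.0, 95.0),
    "line_flow_m3_h":       (0.5, 40.0),
    "bsw_pct":              (0.0, 100.0),
}

def read_tag(name):
    # Placeholder for a PLC read; here we just simulate a value.
    return random.uniform(0.0, 100.0)

def scan_cycle():
    for name, (low, high) in TAGS.items():
        value = read_tag(name)
        state = "ALARM" if value < low or value > high else "OK"
        print(f"{name:22s} {value:8.2f}  {state}")

if __name__ == "__main__":
    for _ in range(3):          # a real system would run continuously
        scan_cycle()
        time.sleep(1.0)
```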
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Variational inequalities and related problems may be solved via smooth bound-constrained optimization. A comprehensive discussion of the important features involved in this strategy is presented. Complementarity problems and mathematical programming problems with equilibrium constraints are included in this report. Numerical experiments are discussed. Conclusions and directions for future research are indicated.
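One standard way of recasting a complementarity problem as a smooth minimization problem, shown below only as an illustration and not claimed to be the exact strategy of the report, is to minimize the squared Fischer-Burmeister residual; the data are made up.

```python
# Illustration of recasting a linear complementarity problem
#   x >= 0,  Mx + q >= 0,  x'(Mx + q) = 0
# as a smooth minimization via the Fischer-Burmeister merit function.
# This is a standard textbook reformulation; M and q are made-up data.
import numpy as np
from scipy.optimize import minimize

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
q = np.array([-4.0, -5.0])

def fischer_burmeister(a, b):
    # phi(a, b) = 0  <=>  a >= 0, b >= 0, a*b = 0
    return np.sqrt(a**2 + b**2) - a - b

def merit(x):
    f = M @ x + q
    return 0.5 * np.sum(fischer_burmeister(x, f) ** 2)

res = minimize(merit, x0=np.zeros(2), method="L-BFGS-B",
               bounds=[(0, None), (0, None)])
x = res.x
print("x =", x, " Mx+q =", M @ x + q, " merit =", merit(x))
```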
Abstract:
The IEEE 1451 standard is intended to address the problem of interfacing smart transducers in network environments. Proprietary hardware and software is usually a very efficient solution for implementing the IEEE 1451 normative, although it can be expensive and inflexible. In contrast, the use of open and standardized tools for implementing the IEEE 1451 normative is proposed in this paper. Tools such as the Java and Python programming languages, Linux, programmable logic technology, personal computer resources and the Ethernet architecture were integrated in order to construct a network node based on the IEEE 1451 standards. The node can be applied in systems based on the client-server communication model. An evaluation of the employed tools and experimental results are presented. © 2005 IEEE.
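To illustrate the client-server communication model mentioned above, the toy Python sketch below pairs a server that answers a transducer query with a client that issues it; the port, the text protocol and the canned reading are invented for illustration and are not part of the IEEE 1451 implementation described in the paper.

```python
# Toy client-server exchange in the spirit of the network node described
# above. The port, the text protocol ("READ temperature") and the canned
# reading are invented; IEEE 1451 defines its own message formats.
import socket
import threading

HOST, PORT = "127.0.0.1", 9151   # hypothetical address and port
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                      # signal that the server is listening
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode().strip()
            if request == "READ temperature":
                conn.sendall(b"23.5 C\n")    # canned transducer reading
            else:
                conn.sendall(b"ERROR unknown request\n")

def client():
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(b"READ temperature\n")
        print(sock.recv(1024).decode().strip())

if __name__ == "__main__":
    t = threading.Thread(target=server, daemon=True)
    t.start()
    ready.wait()
    client()
    t.join()
```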