983 results for Packing for shipment -- Automation


Relevance: 10.00%

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web; hence, web users relying on search engines alone cannot discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters that a user provides via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces give web users online access to myriad databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: though the term deep Web was coined in 2000, long ago by the standards of web technology, we still do not know many important characteristics of the deep Web. Another concern is that existing surveys of the deep Web are predominantly based on studies of deep web sites in English. Findings from these surveys may therefore be biased, especially given the steady growth of non-English web content.
Surveying national segments of the deep Web is thus of interest not only to national communities but to the whole web community. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from that national segment. Finding deep web resources: the deep Web has been growing at a very fast pace, and it has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. Such assumptions rarely hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: retrieving information by filling out web search forms is a typical task for a web user, all the more so as the interfaces of conventional search engines are themselves web forms.
At present, a user must manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in e-commerce. Automating the querying and retrieval of data behind search interfaces is therefore desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
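The kind of automated form querying described above can be sketched minimally in Python: a search interface is modelled as a set of named fields serialized into a GET query URL, and a small parser pulls records out of the result page. The URL, field names and the `class="result"` wrapper below are hypothetical stand-ins, not the thesis's actual data model or query language.

```python
from html.parser import HTMLParser
from urllib.parse import urlencode

def build_query_url(base_url, fields):
    """Serialize filled-in form fields into a GET query URL
    (a simplified model of submitting a search interface)."""
    return base_url + "?" + urlencode(fields)

class ResultExtractor(HTMLParser):
    """Collect the text of elements marked class="result"
    (a stand-in for site-specific result-page wrappers)."""
    def __init__(self):
        super().__init__()
        self.results = []
        self._depth = 0  # >0 while inside a result element

    def handle_starttag(self, tag, attrs):
        if self._depth:
            self._depth += 1          # nested tag inside a result
        elif ("class", "result") in attrs:
            self._depth = 1           # entering a new result element
            self.results.append("")

    def handle_endtag(self, tag):
        if self._depth:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth:
            self.results[-1] += data.strip()

# Fill out the (hypothetical) form and parse a sample result page:
url = build_query_url("http://db.example.com/search",
                      {"title": "automation", "year": "2005"})
page = '<ul><li class="result">First hit</li><li class="result">Second hit</li></ul>'
parser = ResultExtractor()
parser.feed(page)
```

A real deep web query system would additionally handle POST forms, client-side scripts and non-HTML interfaces, which is precisely what the thesis's data model addresses.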

Relevance: 10.00%

Abstract:

Sawmills use thermal energy for drying sawn timber. The heat is produced mainly by burning by-products such as bark and sawdust. To achieve good drying quality, the temperature and humidity of the drying air must be at the correct levels; unfavourable drying conditions slow down drying or cause drying defects. At the sawmill examined in this work, heat is produced with two bark boilers burning by-products and, when needed, with a heavy fuel oil boiler. The heavy fuel oil boiler also produces the humidification steam for the drying kilns. By developing the automation, heat production was made to respond better to the varying heat demand, and thanks to the well-functioning heat production system the drying conditions have remained more stable than before. Low-pressure steam is led to some of the sawmill's kilns. Steaming improves drying quality and enables faster drying. The kilns are steamed after a load change so that the air humidity rises quickly to the target level; after the heating phase, steam humidification should not be used before the next load change. By developing the kiln automation, steam use was made more cost-effective than before, and with these changes the fuel costs have decreased.

Relevance: 10.00%

Abstract:

The process computer (PTK) of the Mertaniemi power plants was renewed in spring 2005. The purpose of this work has been to help correct errors and identify deficiencies in the PTK, focusing especially on building the process reporting. The beginning of the work presents the technical data of the Mertaniemi power plant and background information on the PTK procurement. The hardware, application software and basic software of the new PTK system are described, as is the data transfer between the PTK and other systems. The naming of PTK variables is introduced to make it easier to understand the meanings of the position codes used in the work. The development of process reporting covers the need for reports and their content, as well as how the reports were made. Emission reporting is presented as its own section, because power plant emissions must be monitored in accordance with official regulations and the requirements of EU directives. In addition to the reports, shared trend and workspace displays were created to make it easier to follow process values. The problem areas of the PTK addressed are errors in variable codes and names, and the verification of PTK calculations. Checking of variable names and calculations was carried out while building the process reporting, in cooperation with Metso Automation Oy, the supplier of the PTK system. Correcting the emission calculation was particularly important.

Relevance: 10.00%

Abstract:

Selection of an ERP based on a study of optimal processes for the meat industry.

Relevance: 10.00%

Abstract:

This work surveyed the different thread manufacturing methods and selected the most suitable alternative for producing the threads of ribbed-bar bolts, with particular emphasis on lead time and tooling costs. Based on the selected method, three production cells with different levels of automation were designed for manufacturing the products in Finland and Slovakia, and their investment costs and benefits were calculated. The production planning also took into account the other manufacturing stages of the bolt besides thread production, and the efficiency of the current process parameters was evaluated. The work identified a thread production method that is faster and cheaper in tooling costs than the current one, and the current process parameters could also be sped up. The production cell selected on the basis of this work will be implemented if the practical tests succeed.

Relevance: 10.00%

Abstract:

This master’s thesis develops an operating model for a sales-oriented energy audit, conducted by an electrical equipment supplier, of pumping, fan and other motor applications at power plants. The study reviews the largest factors affecting internal electricity use at a power plant, adopts an energy-audit-like approach as the basis for information gathering, and presents the information needed to conduct the analysis. The model is tested in practice at the kraft recovery boiler of a chemical pulping mill. The targets chosen represent some of the largest electric motor applications in the boiler itself and in its fuel handling. The energy saving potential of the chosen targets is calculated by simulating the energy consumption of the alternative control methods for each target and combining this information with the volume flow duration curve. The results of the research are mixed, as not all of the information needed is available in the automation system. Some of the targets could be simulated and their energy saving potential calculated quite easily; for others, the monitoring was not sufficient and additional measurements would have been needed to base the calculations on. In traditional energy audits, the energy efficiency of pump and fan applications is not necessarily examined, so there are good possibilities for further developing the targeted energy audit procedure presented here.
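The duration-curve calculation mentioned above can be illustrated with a toy Python sketch: annual energy is the sum over flow bins of hours times shaft power, evaluated for two control alternatives. The power models (linear for throttling, cubic affinity-law for a variable-speed drive) and all numbers are invented for illustration; they are not from the thesis.

```python
# Estimate annual pumping energy for two control alternatives against a
# volume flow duration curve. All figures are illustrative assumptions.

def energy_kwh(duration_curve, power_at_flow):
    """duration_curve: list of (hours, flow_fraction) bins;
    power_at_flow: maps flow fraction -> shaft power in kW."""
    return sum(hours * power_at_flow(q) for hours, q in duration_curve)

P_NOM = 200.0  # kW at full flow (assumed motor application)

def throttled(q):
    # Throttle control: power falls only mildly with flow (assumed linear model).
    return P_NOM * (0.5 + 0.5 * q)

def variable_speed(q):
    # Variable-speed drive: affinity laws give roughly cubic power vs flow.
    return P_NOM * q ** 3

# Flow duration curve: hours per year spent at each flow fraction (assumed).
curve = [(2000, 1.0), (4000, 0.7), (2000, 0.4)]

e_throttle = energy_kwh(curve, throttled)       # 1,360,000 kWh/a
e_vsd = energy_kwh(curve, variable_speed)       #   700,000 kWh/a
saving = e_throttle - e_vsd                     # potential saving
```

The sketch shows why a control method that lets power follow the cube of flow dominates at partial loads, which is typically where the audit finds its savings.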

Relevance: 10.00%

Abstract:

This project aims to implement a home automation solution by reusing low-cost devices available in a home that are managed independently and lack standardized communication protocols for interconnection with other environments.

Relevance: 10.00%

Abstract:

The present work describes the development of a fast and robust analytical method for the determination of 53 antibiotic residues, covering various chemical groups and some of their metabolites, in environmental matrices that are considered important sources of antibiotic pollution, namely hospital and urban wastewaters, as well as in river waters. The method is based on automated off-line solid phase extraction (SPE) followed by ultra-high-performance liquid chromatography coupled to quadrupole linear ion trap tandem mass spectrometry (UHPLC–QqLIT). For unequivocal identification and confirmation, and in order to fulfill EU guidelines, two selected reaction monitoring (SRM) transitions per compound are monitored (the most intense one is used for quantification and the second one for confirmation). Quantification of target antibiotics is performed by the internal standard approach, using one isotopically labeled compound for each chemical group in order to correct matrix effects. The main advantages of the method are the automation and speed-up of sample preparation through reduced extraction volumes for all matrices, the fast separation of a wide spectrum of antibiotics by ultra-high-performance liquid chromatography, its sensitivity (limits of detection in the low ng/L range) and its selectivity (due to the use of tandem mass spectrometry). The inclusion of β-lactam antibiotics (penicillins and cephalosporins), which are difficult to analyze in multi-residue methods due to their instability in water matrices, and of some antibiotic metabolites are other important benefits of the method developed. As part of the validation procedure, the method was applied to the analysis of antibiotic residues in hospital wastewaters, urban influent and effluent wastewaters, and river water samples.
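The internal-standard quantification and two-transition confirmation logic described above can be sketched numerically. The response-factor handling and the tolerance window below are simplified assumptions for illustration; the actual method follows the EU guideline's ratio-dependent tolerance windows and calibration procedure.

```python
# Internal-standard quantification: the analyte response is normalized to an
# isotopically labeled standard spiked at a known concentration, correcting
# matrix-induced signal suppression/enhancement. Numbers are illustrative.

def quantify(area_analyte, area_is, conc_is, response_factor):
    """Concentration = (analyte/IS peak-area ratio) * IS concentration
    divided by the relative response factor from calibration."""
    return (area_analyte / area_is) * conc_is / response_factor

def confirmed(ratio_sample, ratio_standard, tolerance=0.2):
    """SRM confirmation: the ratio of the two monitored transitions must
    match the reference standard within a tolerance (simplified here to
    a flat 20%; the EU guideline uses ratio-dependent windows)."""
    return abs(ratio_sample - ratio_standard) <= tolerance * ratio_standard

conc = quantify(area_analyte=15000, area_is=30000,
                conc_is=100.0, response_factor=0.5)   # ng/L
ok = confirmed(ratio_sample=0.42, ratio_standard=0.40)
```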

Relevance: 10.00%

Abstract:

Rapid manufacturing is an advanced manufacturing technology that produces a part layer by layer. This paper presents experimental work carried out to investigate the effects of scan speed, layer thickness, and build direction on the following part features: dimensional error, surface roughness, and mechanical properties for DMLS with DS H20 powder and SLM with CL 20 powder (1.4404/AISI 316L). The findings were evaluated using analysis of variance (ANOVA). According to the experimental results, build direction has a significant effect on part quality in terms of dimensional error and surface roughness; for the SLM process, build direction has no influence on mechanical properties. The results of this research help industry estimate part quality and mechanical properties before producing parts with additive manufacturing using iron-based powders.
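The ANOVA evaluation used in the paper can be sketched with a minimal one-way F test: does build direction shift mean surface roughness more than the within-group scatter explains? The Ra values below are invented for illustration and are not the paper's data.

```python
# One-way ANOVA sketch: effect of build direction on surface roughness.
from statistics import mean

def f_oneway(*groups):
    """Return the one-way ANOVA F statistic for k groups:
    between-group mean square over within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical Ra (um) for three build directions:
horizontal = [6.1, 5.9, 6.3]
angled_45 = [9.8, 10.1, 9.7]
vertical = [12.5, 12.2, 12.9]

f_stat = f_oneway(horizontal, angled_45, vertical)
# An F far above the F(2, 6) critical value (~5.14 at alpha = 0.05) would
# indicate a significant effect of build direction on roughness.
```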

Relevance: 10.00%

Abstract:

The Escola Politècnica Superior of the Universitat de Vic has a flexible manufacturing cell from the manufacturer Festo that simulates an automated storage process. The cell is composed of four distinct, independent assembly stations: the pallet station, the plate station, the intermediate warehouse station and the transport station. Each station consists of electric and pneumatic sensors and actuators from Festo, connected to a SIEMENS S7-300 PLC. The four PLCs (one per station) are interconnected via the Profibus industrial communication bus. The objective of this work is to adapt the programming of the PLCs and to build a SCADA system in order to control the operation of the manufacturing cell as a whole through the Vijeo Citect software; in this way the operation of the cell becomes well understood and can be exploited for teaching. The project was carried out in four main phases. 1. Study of the stations: the operating manuals were studied and the program code of the PLCs was interpreted, in order to know the programs well enough to interact with them later through the SCADA system. 2. Design and programming of the SCADA system: the complete graphic design of the SCADA interface screens was produced, along with the programming of the objects, the connection to the PLCs and the database. 3. Commissioning of the complete system: once the operation of the stations was thoroughly understood and the SCADA system was complete, the whole setup was commissioned and the correct operation and interaction of the systems was verified. 4. Writing of the project report: in this last phase the report was written, explaining the characteristics and operation of all the stations and of the SCADA system.
The most relevant conclusion of this work is a clear demonstration of the power and simplicity that SCADA systems have brought to the world of automation. Years ago, supervising the state of an automated system required a large space with big control panels made up of a great number of indicator lights, potentiometers, switches, push-buttons and displays, and above all bulky and complex wiring. Thanks to SCADA systems, all of this can nowadays be reduced to a PC or touch terminal with clear graphic screens and a wide range of options for supervising, controlling and configuring the automated system.

Relevance: 10.00%

Abstract:

Some recent studies have characterized the stability of blood variables commonly measured for the Athlete Biological Passport. The aim of this study was to characterize the impact of different shipment conditions on the quality of the results returned by the haematological analyzer. Twenty-two healthy male subjects provided five EDTA tubes each. Four refrigerated shipment durations (24, 36, 48, 72 h) were tested and compared to a set of samples left in the laboratory, also under refrigerated conditions (control group). All measurements were conducted using two Sysmex XT-2000i analyzers. Haemoglobin concentration, reticulocyte percentage, and OFF-score numerical data were the same for samples analyzed just after collection and after refrigerated shipment of up to 72 h. Detailed information reported by the differential (DIFF) channel scatterplot of the Sysmex XT-2000i indicated signs of blood deterioration, but these were not relevant for the variables used in the Athlete Biological Passport. As long as the cold chain is guaranteed, the time delay between the collection and the analysis of blood variables can be extended. Copyright © 2015 John Wiley & Sons, Ltd.
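The OFF-score mentioned above combines the two variables the study tracks. Its published formula is haemoglobin in g/L minus 60 times the square root of the reticulocyte percentage; the sample values below are illustrative, not the study's data.

```python
import math

def off_score(hb_g_dl, ret_pct):
    """OFF-score as used in the Athlete Biological Passport:
    Hb [g/L] - 60 * sqrt(reticulocyte %).
    hb_g_dl is given in g/dL, as analyzers usually report it."""
    return hb_g_dl * 10.0 - 60.0 * math.sqrt(ret_pct)

# The study's finding, restated: a sample analyzed on arrival and one
# analyzed after 72 h of refrigerated shipment yield the same score when
# Hb and reticulocytes are stable (illustrative values):
fresh = off_score(15.0, 1.0)    # Hb 15.0 g/dL, ret 1.0 %
shipped = off_score(15.0, 1.0)  # same measured values after shipment
```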

Relevance: 10.00%

Abstract:

Recent standardization efforts in e-learning technology have resulted in a number of specifications; however, the automation process that is considered essential in a learning management system (LMS) remains less explored. As learning technology becomes more widespread and more heterogeneous, there is a growing need to specify processes that cross the boundaries of a single LMS or learning resource repository. This article proposes a specification oriented to automation that takes on board the heterogeneity of systems and formats and provides a language for specifying complex and generic interactions. With this goal in mind, a three-step technique is suggested: semantic conformance profiles, a business process management (BPM) diagram, and its translation into the business process execution language (BPEL) appear suitable for achieving it.

Relevance: 10.00%

Abstract:

Automation or semi-automation of learning scenario specifications is one of the least explored subjects in the e-learning research area. There is a need for a catalogue of learning scenarios and a technique to facilitate automated retrieval of stored specifications. This requires constructing an ontology for this purpose, which is justified in this paper. The ontology must mainly support a specification technique for learning scenarios, and it should also be useful in the creation and validation of new scenarios, as well as in the personalization of learning scenarios or their monitoring. Thus, after justifying the need for this ontology, a first approach to a possible knowledge domain is presented. An example of a concrete learning scenario illustrates some relevant concepts supported by the ontology, defining the scenario in such a way that it could easily be automated.

Relevance: 10.00%

Abstract:

The strength properties of a paper coating layer are very important in converting and printing operations. Coating strength that is too high or too low can cause several problems in printing; one of them is cracking at the fold. After printing, the paper is folded to its final form and the pages are stapled together. In folding, the coating can crack, causing aesthetic damage to the printed image or, in the worst case, the centre sheet can fall off in stapling. When the paper is folded, one side undergoes tensile stresses and the other side compressive stresses; if the difference between these stresses is too high, the coating can crack at the fold. To better predict and prevent cracking at the fold, it is useful to know the strength properties of the coating layer. The tensile strength of the coating layer has been measured before, but not its compressive strength. This study sought a way to measure the compressive strength of the coating layer and investigated how different coatings behave in compression. The short-span crush test, normally used to measure the in-plane compressive strength of paperboards, was applied to the coating layer; in this method the free span of the specimen is very small, which prevents buckling. The compressive strength was measured for free coating films as well as for coated paper, and the tensile strength and Bendtsen air permeance of the coating films were also measured. The results showed that the shape of the pigment has a great effect on the strength of the coating. A platy pigment gave much better strength than a round or needle-like pigment. On the other hand, calcined kaolin, which is also platy but whose particles are aggregated, decreased the strength substantially. The differences in strength can be explained by the packing of the particles, which affects the porosity and thus the strength.
Platy kaolin packs much better than the other pigments and creates a less porous structure. The results also showed that the binder properties have a great effect on the compressive strength of the coating layer: both the amount of latex and its glass transition temperature, Tg, affect the strength. As the amount of latex increases, the strength of the coating increases as well, because a larger amount of latex binds the pigment particles better together and decreases the porosity. The compressive strength also increased with increasing Tg, because a hard latex gives a stiffer and less elastic film than a soft latex.
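The folding stresses described above can be made concrete with a thin-sheet bending approximation: when a sheet of thickness t is folded around an inner radius r, the outer surface is stretched by roughly t / (2r + t) (strain measured from the mid-plane). If that surface strain exceeds the coating's failure strain, the coating cracks at the fold. This model and all numbers are illustrative assumptions, not measurements from the study.

```python
# Estimate surface strain at a fold and compare it to the coating's
# failure strain (all values hypothetical, for illustration only).

def surface_strain(thickness_um, fold_radius_um):
    """Outer-surface strain for bending a sheet of thickness t around an
    inner radius r, with the neutral axis at the mid-plane: t / (2r + t)."""
    t, r = thickness_um, fold_radius_um
    return t / (2.0 * r + t)

def cracks(thickness_um, fold_radius_um, coating_failure_strain):
    """The coating cracks when the fold strain exceeds its failure strain."""
    return surface_strain(thickness_um, fold_radius_um) > coating_failure_strain

strain = surface_strain(thickness_um=100.0, fold_radius_um=200.0)
brittle = cracks(100.0, 200.0, coating_failure_strain=0.05)  # stiff, high-Tg latex
ductile = cracks(100.0, 200.0, coating_failure_strain=0.30)  # soft, low-Tg latex
```

The sketch mirrors the study's binder finding: a soft, low-Tg latex tolerates more fold strain before cracking, even though a hard latex gives a higher compressive strength.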

Relevance: 10.00%

Abstract:

A simple and inexpensive device to automate a water distilling apparatus is presented. It is composed of a magnetic floater placed in the water reservoir and a level control unit, which acts on the heating element circuit. To save water, an electromagnetic valve is inserted in the water supply inlet. Some suggestions for adapting the device to other types of equipment are also offered.
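The control logic of such a device can be sketched as a simple hysteresis loop: the floater reports the reservoir level, and the controller gates the heating element and the inlet valve between a low and a high trip point so neither chatters. The threshold values and class names are assumptions for illustration, not the device's actual circuit.

```python
# Minimal sketch of the distiller's level-control logic (hypothetical
# thresholds; the real device implements this with a floater and relays).

class StillController:
    LOW, HIGH = 0.2, 0.9  # floater trip points (fraction of reservoir)

    def __init__(self):
        self.heater_on = False
        self.valve_open = False

    def update(self, level):
        """Apply hysteresis: below LOW, cut the heater and refill;
        above HIGH, close the valve and heat. Between the trip points
        the previous state is held, so the outputs do not chatter."""
        if level < self.LOW:
            self.heater_on = False   # never heat a near-empty reservoir
            self.valve_open = True   # open the inlet valve to refill
        elif level > self.HIGH:
            self.valve_open = False  # stop refilling, saving water
            self.heater_on = True    # safe to distil
        return self.heater_on, self.valve_open

c = StillController()
```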