936 results for Process control -- Data processing
Abstract:
A tool that streamlines as far as possible all the management tasks both of the organisations belonging to the NGO coordinating body of Lleida and of the coordinating body itself.
Abstract:
In this project, in order to analyse through an example (the database of an online shop) some of the possibilities that XML offers, a system was built that generates relational databases (with SQL) from class diagrams defined in XML. This demonstrates the flexibility that XML brings to the definition, processing and transformation of documents.
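A minimal sketch of the XML-to-SQL idea described above, using only the standard library; the element names, attribute names, and schema here are invented for illustration and are not the project's actual class-diagram format:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML class diagram: class names with typed attributes.
XML_MODEL = """
<model>
  <class name="Product">
    <attribute name="id" type="INTEGER" primaryKey="true"/>
    <attribute name="name" type="VARCHAR(100)"/>
    <attribute name="price" type="DECIMAL(10,2)"/>
  </class>
</model>
"""

def xml_to_sql(xml_text):
    """Translate each <class> element into a CREATE TABLE statement."""
    root = ET.fromstring(xml_text)
    statements = []
    for cls in root.findall("class"):
        cols = []
        for attr in cls.findall("attribute"):
            col = f'{attr.get("name")} {attr.get("type")}'
            if attr.get("primaryKey") == "true":
                col += " PRIMARY KEY"
            cols.append(col)
        statements.append(
            f'CREATE TABLE {cls.get("name")} (\n  ' + ",\n  ".join(cols) + "\n);"
        )
    return statements

for stmt in xml_to_sql(XML_MODEL):
    print(stmt)
```

The same traversal could equally emit DROP/ALTER statements or foreign keys from association elements, which is the flexibility the abstract refers to.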
Abstract:
The final-year project described in this document covers the analysis, design, implementation and documentation of a J2EE application whose purpose is to manage the long-running processes or tasks that form part of the business workflows of many organisations.
Abstract:
A rigorous study of graphical interfaces for the industrial sector. It analyses the most common user profile(s) in this sector (their characteristics and their needs), presents and describes several design guidelines and graphical elements that meet a set of predefined requirements, builds a sample mock-up with a series of screens (explaining and justifying how they work) and, finally, proposes a method for validating the design, a method that may lead to changes to the initial design.
Abstract:
This paper shows the impact of the atomic-capabilities concept for including control-oriented knowledge of linear control systems in the decision-making structure of physical agents. These agents operate in a real environment, managing physical objects (e.g. their physical bodies) in coordinated tasks. The approach is presented using introspective reasoning and control theory, based on the specific tasks of passing a ball and executing the offside manoeuvre between physical agents in the robotic soccer testbed. Experimental results and conclusions are presented, emphasising how the approach improves multi-agent performance in cooperative systems.
Abstract:
Researchers working in the field of global connectivity analysis using diffusion magnetic resonance imaging (MRI) can count on a wide selection of software packages for processing their data, with methods ranging from the reconstruction of the local intra-voxel axonal structure to the estimation of the trajectories of the underlying fibre tracts. However, each package is generally task-specific and uses its own conventions and file formats. In this article we present the Connectome Mapper, a software pipeline aimed at helping researchers through the tedious process of organising, processing and analysing diffusion MRI data to perform global brain connectivity analyses. Our pipeline is written in Python and is freely available as open-source at www.cmtk.org.
Abstract:
BACKGROUND: Solexa/Illumina short-read ultra-high-throughput DNA sequencing technology produces millions of short tags (up to 36 bases) by parallel sequencing-by-synthesis of DNA colonies. The processing and statistical analysis of such high-throughput data poses new challenges; currently a fair proportion of the tags are routinely discarded due to an inability to match them to a reference sequence, thereby reducing the effective throughput of the technology. RESULTS: We propose a novel base calling algorithm using model-based clustering and probability theory to identify ambiguous bases and code them with IUPAC symbols. We also select optimal sub-tags using a score based on information content to remove uncertain bases towards the ends of the reads. CONCLUSION: We show that the method improves genome coverage and the number of usable tags, as compared with Solexa's data processing pipeline, by an average of 15%. An R package is provided which allows fast and accurate base calling of Solexa's fluorescence intensity files and the production of informative diagnostic plots.
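The IUPAC-coding step can be illustrated with a small sketch. The per-base posterior probabilities and the plausibility threshold below are invented for illustration; the paper's actual method derives them from model-based clustering of fluorescence intensities:

```python
# Standard IUPAC nucleotide codes for sets of plausible bases.
IUPAC = {
    frozenset("A"): "A", frozenset("C"): "C",
    frozenset("G"): "G", frozenset("T"): "T",
    frozenset("AG"): "R", frozenset("CT"): "Y",
    frozenset("CG"): "S", frozenset("AT"): "W",
    frozenset("GT"): "K", frozenset("AC"): "M",
    frozenset("CGT"): "B", frozenset("AGT"): "D",
    frozenset("ACT"): "H", frozenset("ACG"): "V",
    frozenset("ACGT"): "N",
}

def call_base(probs, threshold=0.25):
    """Return the IUPAC symbol covering every base whose posterior
    probability is within `threshold` of the best call (an illustrative
    rule, not the paper's exact criterion)."""
    best = max(probs.values())
    plausible = frozenset(b for b, p in probs.items() if best - p <= threshold)
    return IUPAC[plausible]

print(call_base({"A": 0.90, "C": 0.05, "G": 0.03, "T": 0.02}))  # unambiguous A
print(call_base({"A": 0.48, "C": 0.04, "G": 0.44, "T": 0.04}))  # A/G -> R
```

Downstream aligners that understand IUPAC codes can then still match such a tag instead of discarding it.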
Abstract:
A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certification program for the Iowa Department of Transportation (DOT) originated from Iowa Highway Research Board (IHRB) embankment quality research projects. Since this research, the Iowa DOT has applied compaction with moisture control on most embankment work under pavements. This study set out to independently evaluate the actual quality of compaction under the current specifications. Results show that Proctor tests conducted by Iowa State University (ISU), using representative material obtained from each test section where field testing was conducted, gave optimum moisture contents and maximum dry densities different from those selected by the Iowa DOT for QC/quality assurance (QA) testing. Comparisons between the measured and selected values showed a standard error of 2.9 lb/ft³ for maximum dry density and 2.1% for optimum moisture content. The difference in optimum moisture content was as high as 4%, and the difference in maximum dry density was as high as 6.5 lb/ft³. The differences at most test locations, however, were within the allowable variation suggested in AASHTO T 99 for test results between different laboratories. The ISU testing results showed higher rates of data outside the target limits than the available contractor QC data for cohesive materials indicated. Also, wet fill materials were often observed during construction; several test points indicated that materials were placed and accepted wet of the target moisture content.
The statistical analysis indicates that the results obtained in this study improved on those from previous embankment quality research projects (TR-401 Phases I through III and TR-492) in terms of the percentage of data that fell within the specification limits. Although there was evidence of improvement, QC/QA results are not consistently meeting the target limits/values. This report provides recommendations for Iowa DOT consideration, with three proposed options for improving the current specifications. Option 1 enhances the current specifications in terms of material-dependent control limits, training, sampling, and process control. Option 2 develops alternative specifications that incorporate dynamic cone penetrometer or lightweight deflectometer testing into QC/QA. Option 3 incorporates calibrated intelligent compaction measurements into QC/QA.
Abstract:
The goal of this thesis was to reduce the wood losses arising in debarking and to improve line availability at a modern wood debarking plant. The goals were achieved through changes made to the automation system. The drum debarking process has remained largely unchanged for a long time. In recent years, new intelligent measurement devices and control systems capable of optimisation have come onto the market. With modern automation technology it is possible to measure the debarking result and the wood losses generated during debarking in real time. In addition to quality monitoring, the data are used in the automatic control of the debarking process. Achieving an optimal debarking result requires flawless operation of the automation and precise tuning of the controls. The thesis examined the basic automation of the debarking plant and the operation of the higher-level control. Problems identified in process control were corrected. The end result was a constant-debarking-power model that reduces wood losses and improves line availability. The results obtained have significant economic value, and the methods can also be exploited at other drum debarking plants.
Abstract:
The goal of the thesis was to draw up a plan for the adoption of the euro. First, the origins of the euro and the rules for handling the common currency were examined. A workable plan for adopting the euro is then presented, including the euro adoption …

… and speed up the implementation of external selection components. Model and Parametrisation Gallery for Process Integration is a specification of an infrastructure and a set of tools built for storing process component and model library data in an internet-based database, from where it can easily be used by simulator users and other process designers. This work consists of two parts. The literature survey studies pump selection and the factors that affect it, particularly from the process designer's point of view. In the applied part, a pump selector was created and tested in a test environment comparable to the Gallery environment. This was done to ease the deployment of the selector to the actual working environment. Process component data from a manufacturer was used in developing the pump selector. The selector's operation is based on the hydraulic selection of pumps. A centrifugal pump was used as the example case.
Abstract:
The business world is in constant change. Developing a company is essential to its success. If a company is not actively developed, competitors take over its market space, the company stagnates and ultimately ceases operating. The purpose of this thesis was to develop the operations of Compusteel Oy and advance the company's competitiveness in the near future. Compusteel Oy is by nature a subcontractor / component supplier. The most important competitive factor for such a company is efficient production. This thesis focused in particular on examining ways in which production can be made more efficient. Several of the studies used as sources for this work have noted the importance of information transfer as a competitive factor for a modern company. Typically, operational errors stem from inadequate transfer of information and from poor availability of information when it is needed. The study was carried out using literature and other source material, such as training courses. It concentrated in particular on surveying various production control methods and on exploiting them in the company's operations. The result was a new operating model for certain core processes at Compusteel Oy. The adoption of a new way of thinking among production control personnel was seen as particularly important. This new way of thinking can in future be applied in many of the company's development projects. In addition, a new enterprise resource planning system that supports the new operating model was introduced as a result of the work.
Abstract:
Recent years have produced great advances in instrumentation technology. The amount of available data has been increasing due to the simplicity, speed and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without a proper analysis. This has been one of the reasons for the ever-growing success of multivariate handling of such data. Industrial data is commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This makes certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches in the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures or partial least squares), but there are also other methods that should be considered. The more advanced methods include multi-block modeling and nonlinear modeling. In this thesis it is shown that the results of data analysis vary according to the modeling approach used, thus making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should be different from the case where the purpose of modeling is mostly to obtain information about the variables and the process. For industrial applicability it is essential that the methods are robust and sufficiently simple to apply. In this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. Data analysis methods are compared in this thesis using data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries. The results are compared to those from PLS and priority PLS.
The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry. The response has a nonlinear relation to the descriptor matrix, and the results are compared between linear modeling, polynomial PLS and nonlinear modeling using nonlinear score vectors.
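As a toy illustration of the PCA step discussed above, the sketch below finds the first principal component of a two-variable data set by the closed-form eigendecomposition of its 2x2 covariance matrix. The data are invented to mimic two correlated process signals and have nothing to do with the thesis's industrial data sets:

```python
import math

# Toy process data: two strongly correlated sensor readings.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.0), (5.0, 9.9)]

def pca_2d(points):
    """Variance share and direction of the first principal component,
    via the closed-form eigendecomposition of the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]].
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    root = math.sqrt(tr * tr / 4 - det)
    l1, l2 = tr / 2 + root, tr / 2 - root
    explained = l1 / (l1 + l2)        # variance share of PC1
    direction = (sxy, l1 - sxx)       # eigenvector for l1 (unnormalised)
    return explained, direction

explained, direction = pca_2d(data)
print(f"PC1 explains {explained:.1%} of the variance")
```

With near-collinear data like this, PC1 captures almost all of the variance, which is exactly the redundancy among routine process measurements that motivates projection methods such as PCA and PLS.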
Abstract:
The operation of cold storage chambers contributes largely to the quality and longevity of stored products. In recent years, the study of control strategies has intensified, with the aim of decreasing temperature variation inside the storage chamber and reducing electric power consumption. This study developed a system for data acquisition and process control, in the LabVIEW language, to be applied to the cooling system of a 30 m³ refrigerating chamber. The instrumentation and the application developed supported scientific experiments that studied the dynamic behavior of the refrigeration system and compared the performance of the control strategies and of the heat engine, whether in terms of the controlled temperature or of electricity consumption. On-off, PID and fuzzy control strategies were tested. Regarding power consumption, the fuzzy controller showed the best result, saving 10% compared with the other strategies tested.
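A minimal discrete PID loop on a toy first-order chamber model gives a feel for one of the tested strategies. The plant constants, gains, and actuator limits below are invented for illustration; the study's actual system was implemented in LabVIEW, not Python:

```python
def simulate_pid(setpoint=4.0, ambient=25.0, kp=2.0, ki=0.5, kd=0.1,
                 dt=1.0, steps=200):
    """Drive a toy first-order chamber model toward `setpoint` with a
    PID controller acting on the cooling demand."""
    temp, integral, prev_err = ambient, 0.0, ambient - setpoint
    for _ in range(steps):
        err = temp - setpoint                 # positive when too warm
        derivative = (err - prev_err) / dt
        u_raw = kp * err + ki * integral + kd * derivative
        u = max(0.0, min(u_raw, 10.0))        # actuator limits
        if u == u_raw:                        # anti-windup: integrate only
            integral += err * dt              # while the actuator is unsaturated
        # Toy plant: cooling pulls the temperature down, ambient leakage up.
        temp += dt * (-0.1 * u + 0.02 * (ambient - temp))
        prev_err = err
    return temp

print(f"chamber temperature after 200 steps: {simulate_pid():.2f} C")
```

An on-off controller would replace `u` with a simple threshold switch, and a fuzzy controller with rule-based interpolation over the error and its derivative; the loop structure stays the same.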
Abstract:
This thesis concerns image-based characterization of fibers in pulp suspension during the papermaking process. The papermaking industry is focusing on process control optimization and automatization, which makes it possible to manufacture high-quality products in a resource-efficient way. As part of process control, pulp suspension analysis makes it possible to predict and modify properties of the end product. This work is part of the tree species identification task and focuses on the analysis of fiber parameters in the pulp suspension at the wet stage of paper production. The existing machine vision methods for pulp characterization were investigated, and a method was developed that exploits direction-sensitive filtering, non-maximum suppression, hysteresis thresholding, tensor voting, and curve extraction from tensor maps. Applying the method to microscopic grayscale pulp images made it possible to detect curves corresponding to fibers in the pulp image and to compute their morphological characteristics. The performance of the method was evaluated against manually produced ground truth data. The accuracy of fiber characteristic estimation for the acacia pulp images was 84%, 85%, and 60% for length, width, and curvature, respectively.
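The hysteresis-thresholding stage in such pipelines can be sketched as follows: pixels above a high threshold seed the result, and weaker pixels survive only if they are 8-connected (through other weak pixels) to a seed. The thresholds and the tiny test image are illustrative only, not the thesis's parameters:

```python
from collections import deque

def hysteresis(image, low, high):
    """Keep pixels >= high, plus pixels >= low that are 8-connected
    (through other weak pixels) to a strong pixel."""
    rows, cols = len(image), len(image[0])
    keep = [[False] * cols for _ in range(rows)]
    queue = deque((r, c) for r in range(rows) for c in range(cols)
                  if image[r][c] >= high)
    for r, c in queue:
        keep[r][c] = True
    while queue:                       # flood fill from the strong seeds
        r, c = queue.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (0 <= rr < rows and 0 <= cc < cols and not keep[rr][cc]
                        and image[rr][cc] >= low):
                    keep[rr][cc] = True
                    queue.append((rr, cc))
    return keep

# A weak response (0.4) attached to a strong pixel (0.9) survives;
# an isolated weak pixel of the same strength does not.
img = [[0.0, 0.4, 0.9],
       [0.0, 0.0, 0.0],
       [0.4, 0.0, 0.0]]
mask = hysteresis(img, low=0.3, high=0.8)
print(mask)
```

This is what lets faint but continuous fiber edges survive while isolated noise of the same intensity is suppressed.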
Abstract:
The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. Thanks to much-improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study focuses on the development of image analysis methods for the pulping process of papermaking. Pulping starts with wood disintegration and the forming of a fiber suspension that is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee product quality. In order to evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows computation of fiber length and curl index correlating well with the ground truth values. The bubble detection method, the second contribution, was developed to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately and the solution enabled just-in-time process termination, whereas the accurate estimation of bubble size categories remained challenging. As the third contribution, optical flow computation was studied and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images.
Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground truth generation, feature selection, and a performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. Together, these four contributions assist in developing integrated, factory-level, vision-based process control.
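The curl index mentioned above is commonly defined as the fiber's contour length divided by its end-to-end distance, minus one (zero for a perfectly straight fiber); the thesis may use a variant, so treat this definition and the toy traced points as assumptions:

```python
import math

def curl_index(points):
    """Curl index of a traced fiber: contour length over end-to-end
    distance, minus one (a common definition; 0 for a straight fiber)."""
    contour = sum(math.dist(points[i], points[i + 1])
                  for i in range(len(points) - 1))
    span = math.dist(points[0], points[-1])
    return contour / span - 1.0

straight = [(0, 0), (1, 0), (2, 0)]          # collinear trace
bent = [(0, 0), (1, 1), (2, 0)]              # kinked trace
print(curl_index(straight))                  # 0.0
print(round(curl_index(bent), 3))            # 0.414
```

Applied to the curves extracted from tensor maps, this single scalar summarizes fiber deformation independently of fiber length.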