Abstract:
The objective of this Master's thesis (TFM) is to explore the possibilities of the mathematical program MATLAB and its Graphical User Interface Development Environment (GUIDE) tool by developing a program for analysing images of metallographic specimens, to be used in laboratory sessions of the Materials Technology course of the Degree in Mechatronics Engineering taught at the Universitat de Vic. The areas of interest of the work are virtual instrumentation, MATLAB programming, and metallographic image analysis techniques. The report places special emphasis on the design of the interface and of the measurement procedures. The end result is a program that satisfies all the requirements set out in the initial proposal. The program's interface is clear and uncluttered, devoting most of the screen to the image under analysis. The structure and layout of the menus and controls make the program easy and intuitive to use. The program is structured so that it can easily be extended with additional measurement routines, or with the automation of the existing ones. Since the program works as a measuring instrument, an entire chapter of the report is devoted to the procedure for calculating the errors that arise during its use, in order to know their order of magnitude and to be able to recalculate them should the conditions of use change. As regards programming, although MATLAB is not a classical programming environment, it does provide tools for building applications of moderate complexity, oriented mainly towards graphics or images. The GUIDE tool simplifies the creation of the user interface, although it has trouble handling somewhat complex designs.
Moreover, the code generated by GUIDE is not accessible, which prevents manual modification of the interface in those cases where GUIDE has problems. Despite these minor shortcomings, MATLAB's computing power more than makes up for these deficiencies.
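As an illustration of the kind of measurement routine such a metallographic image-analysis program might contain, here is a minimal sketch in Python rather than MATLAB; the threshold and scale values are hypothetical, not taken from the thesis:

```python
def phase_area_fraction(img, threshold=128):
    """Fraction of pixels darker than `threshold`, e.g. a dark phase
    in a grayscale metallographic micrograph (image as list of rows)."""
    pixels = [p for row in img for p in row]
    return sum(1 for p in pixels if p < threshold) / len(pixels)

def pixels_to_microns(n_pixels, scale_um_per_px):
    """Convert a pixel distance to a physical length via the image scale."""
    return n_pixels * scale_um_per_px

# Synthetic 4x4 "micrograph": 4 dark pixels out of 16
img = [[50, 50, 200, 200],
       [50, 50, 200, 200],
       [200, 200, 200, 200],
       [200, 200, 200, 200]]
print(phase_area_fraction(img))      # 0.25
print(pixels_to_microns(120, 0.5))   # 60.0 (micrometres)
```

A real routine would add calibration against a reference scale bar, which is also where the error-propagation chapter of the report would come into play.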
Abstract:
Document delivery and interlibrary loan services are a key element of modern libraries. Many of the new technologies have been decisive in streamlining their processes and reducing response times. E-mail has been one of the main innovations, both for sending requests and for informing users. This article analyses the different means of locating documents, from traditional catalogues on paper or CD-ROM to online access. It also describes the various options for retrieving these documents, in particular the newer ones such as file transfer or online download, as well as value-added services such as the electronic distribution of tables of contents. Finally, it gives a brief description and comparison of the main current suppliers, among them the British Library, INIST, UNCOVER, EBSCODOC, OCLC, KNAW, UMI, ISI, etc.
Abstract:
The aim of this review is to present GenIsisWeb, a program that assists in publishing databases on the Internet. After a brief technical description, and starting from a database previously created with CDS/ISIS, the complete process of publishing it on the Web is shown.
Abstract:
In order to survive, libraries must adapt to users' demands in a more flexible, responsive and effective way. New technological developments help the library meet its new needs, which is why switching from one automated system to another is increasingly frequent. Implementing a second system is a more complex process, owing to the new features it brings and the need to migrate the data already held in the first system. This article analyses the why, how and when of changing systems, as well as the preliminary lines of action to follow before starting the change: objectives, needs analysis, definition of technical specifications, and evaluation. Once the product has been chosen, the management of the change begins, involving both management, who must control it, and the staff, who act as agents of the change. Finally, the article analyses the crucial role that staff participation, training, and the communication channels established within the library play in the success of the new system.
Abstract:
In the past decade, a number of trends have come together in the general sphere of computing that have profoundly affected libraries. The popularisation of the Internet, the appearance of open and interoperable systems, the improvements within graphics and multimedia, and the generalised installation of LANs are some of the events of the period. Taken together, the result has been that libraries have undergone an important functional change, representing the switch from simple information depositories to information disseminators. Integrated library management systems have not remained unaffected by this transformation and those that have not adapted to the new technological surroundings are now referred to as legacy systems. The article describes the characteristics of systems existing in today's market and outlines future trends that, according to various authors, include the disappearance of the integrated library management systems that have traditionally been sold.
Abstract:
Soil penetration resistance (PR) and the tensile strength of aggregates (TS) are commonly used to characterize the physical and structural conditions of agricultural soils. This study aimed to assess the functionality of a dynamometry apparatus whose mobile base was automated for linear speed and position control, for measuring PR and TS. The proposed equipment was used for PR measurement in undisturbed samples of a clayey "Nitossolo Vermelho eutroférrico" (Kandiudalfic Eutrudox) under rubber trees sampled in two positions (within and between rows). These samples were also used to measure the volumetric soil water content and bulk density, and to determine the soil resistance to penetration curve (SRPC). The TS was measured in a sandy loam "Latossolo Vermelho distrófico" (LVd) - Typic Haplustox - and in a very clayey "Nitossolo Vermelho distroférrico" (NVdf) - Typic Paleudalf - under different uses: LVd under "annual crops" and "native forest", NVdf under "annual crops" and "eucalyptus plantation" (> 30 years old). To measure TS, different strain rates were applied using two dynamometry testing devices: a reference machine (0.03 mm s-1), which has been widely used in other studies, and the proposed equipment (1.55 mm s-1). The determination coefficient values of the SRPC were high (R² > 0.9), regardless of the sampling position. Mean TS values in LVd and NVdf obtained with the proposed equipment did not differ (p > 0.05) from those of the reference testing apparatus, regardless of land use and soil type. Results indicate that PR and TS can be measured faster and more accurately by the proposed procedure.
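The goodness of fit reported for the SRPC can be illustrated with the usual coefficient of determination; a minimal sketch in Python (the data points below are hypothetical, not the study's measurements):

```python
def r_squared(observed, predicted):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Hypothetical PR readings (MPa) vs. values predicted by a fitted SRPC
observed  = [1.2, 1.8, 2.5, 3.1, 4.0]
predicted = [1.3, 1.7, 2.4, 3.2, 3.9]
print(r_squared(observed, predicted) > 0.9)   # True for a good fit
```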
Abstract:
The aim of this study was to evaluate the forensic protocol recently developed by Qiagen for the QIAsymphony automated DNA extraction platform. Samples containing low amounts of DNA were specifically considered, since they represent the majority of samples processed in our laboratory. The analysis of simulated blood and saliva traces showed that the highest DNA yields were obtained with the maximal elution volume available for the forensic protocol, i.e. 200 µl. The resulting DNA extracts were too diluted for successful DNA profiling and required concentration. This additional step is time consuming and potentially increases inversion and contamination risks. The 200 µl DNA extracts were concentrated to 25 µl, and the DNA recovery was estimated with real-time PCR as well as with the percentage of SGM Plus alleles detected. Results using our manual protocol, based on the QIAamp DNA mini kit, and the automated protocol were comparable. Further tests will be conducted to determine DNA recovery, contamination risk and PCR inhibitor removal more precisely, once a definitive procedure allowing the concentration of DNA extracts from low-yield samples becomes available for the QIAsymphony.
Abstract:
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are, therefore, of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man oriented), FMEA (system oriented), or HAZOP (process oriented), is not satisfactory. The use of a dynamic modeling approach in order to allow multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized on an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced, in an explicit way, through machine interconnections and flow properties. While man-machine interactions are modeled as triggering events for the state spaces of the machines, the CREAM cognitive behavior model is used in order to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework.
The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. But, most of all, it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
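The basic mechanics of such a net (markings, enabled transitions, firing) can be sketched as follows. This is a generic place/transition net in Python, not the CO-OPN formalism itself; the place and transition names are illustrative:

```python
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (dict(inputs), dict(outputs))

    def enabled(self, name):
        """A transition is enabled when every input place holds enough tokens."""
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        """Consume input tokens and produce output tokens."""
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# A machine cycle triggered by a man-machine interaction event
net = PetriNet({"part_waiting": 1, "machine_idle": 1})
net.add_transition("start_cycle",
                   inputs={"part_waiting": 1, "machine_idle": 1},
                   outputs={"machine_busy": 1})
net.fire("start_cycle")
print(net.marking)   # {'part_waiting': 0, 'machine_idle': 0, 'machine_busy': 1}
```

Interconnecting machines then amounts to sharing places between subnets, which is where the flow and measure constraints described above come in.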
Abstract:
Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. By the application of significant down-force, and the use of an appropriate cutting edge angle, compacted snow and ice can be removed very effectively by such plows, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied to the plow, then it will not cut the ice or compacted snow. However, if too much force is applied, then either the cutting edge may gouge the road surface, causing significant damage often to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator. Spinning of the truck in such situations is easily accomplished. Further, excessive down force will result in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. In order to successfully automate the operation of an underbody plow, a control system must be developed that follows a set of rules that represent appropriate operation of such a plow. These rules have been developed, based upon earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. These rules have been successfully coded into two different computer programs, both using the MatLab® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation. This program is essentially deterministic in nature. In the second program, the Simulink® package in the MatLab® software system was used to implement these rules using fuzzy logic.
Fuzzy logic essentially replaces a fixed and constant rule with one that varies in such a way as to improve operational control. The development of the fuzzy logic in this simulation was achieved simply by using appropriate routines in the computer software, rather than being developed directly. The results of the computer testing and simulation indicate that a fully automated, computer controlled underbody plow is indeed possible. The issue of whether the next steps toward full automation should be taken (and by whom) has also been considered, and the possibility of some sort of joint venture between a Department of Transportation and a vendor has been suggested.
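The flavour of such a varying rule can be sketched with a Mamdani-style blend of membership functions. This is a generic Python illustration, not the actual Simulink® implementation, and the force ranges are hypothetical:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def downforce_adjustment(force_kN):
    """Blend three rules on measured vertical load:
    too low -> raise force (+1), adequate -> hold (0), too high -> reduce (-1).
    Returns a weighted average of the rule outputs."""
    low  = tri(force_kN, -1, 0, 20)    # hypothetical ranges, for illustration
    ok   = tri(force_kN, 10, 25, 40)
    high = tri(force_kN, 30, 50, 51)
    num = low * 1 + ok * 0 + high * (-1)
    den = low + ok + high
    return num / den if den else 0.0

print(downforce_adjustment(0))    # 1.0  -> increase down pressure
print(downforce_adjustment(25))   # 0.0  -> hold
print(downforce_adjustment(50))   # -1.0 -> back off before gouging
```

Unlike a fixed threshold, the blended output changes gradually near the rule boundaries, which is exactly the improvement over a crisp rule described above.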
Abstract:
The goal of this study was to compare the quantity and purity of DNA extracted from biological traces using the QIAsymphony robot with that of the manual QIAamp DNA mini kit currently in use in our laboratory. We found that the DNA yield of the robot was 1.6-3.5 times lower than that of the manual protocol. This resulted in a loss of 8% and 29% of the alleles correctly scored when analyzing 1/400 and 1/800 diluted saliva samples, respectively. Specific tests showed that the QIAsymphony was at least 2-16 times more efficient at removing PCR inhibitors. The higher purity of the DNA may therefore partly compensate for the lower DNA yield obtained. No case of cross-contamination was observed among samples. After purification with the robot, DNA extracts can be automatically transferred into 96-well plates, which is an ideal format for subsequent RT-qPCR quantification and DNA amplification. Less hands-on time and reduced risk of operational errors represent additional advantages of the robotic platform.
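The headline comparisons reduce to simple ratios; a minimal sketch (the numbers below are hypothetical, merely within the ranges reported):

```python
def yield_ratio(manual_ng, robot_ng):
    """How many times more DNA the manual protocol recovered."""
    return manual_ng / robot_ng

def allele_loss_pct(expected_alleles, detected_alleles):
    """Percentage of expected profile alleles missed."""
    return 100 * (expected_alleles - detected_alleles) / expected_alleles

print(yield_ratio(3.5, 1.0))      # 3.5 -> robot yield 3.5x lower
print(allele_loss_pct(20, 17))    # 15.0 -> 15% of alleles lost
```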
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but did not prove to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase goes hand in hand with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up method-complexity approach was adopted, and the results were analyzed together in order to find common definitions of the continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed, leading to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to achieve efficient indoor radon decision making.
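The clustering evaluation mentioned can be illustrated with the classical Morisita index for quadrat counts (a generic sketch, not the thesis's QMI procedure itself):

```python
def morisita_index(counts):
    """Morisita's index of dispersion for quadrat counts n_i:
    I_M = Q * sum(n_i*(n_i-1)) / (N*(N-1)),
    where Q is the number of quadrats and N the total count.
    I_M ~ 1 for random placement, > 1 clustered, < 1 regular."""
    Q = len(counts)
    N = sum(counts)
    return Q * sum(n * (n - 1) for n in counts) / (N * (N - 1))

print(morisita_index([8, 0, 0, 0]))   # 4.0 -> strongly clustered
print(morisita_index([2, 2, 2, 2]))   # ~0.57 -> regular (under-dispersed)
```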
Abstract:
Aesthetic Group Gymnastics (AGG) is an emerging sport on which hardly any field work and/or publications exist. With regard to the scoring code of this discipline, jumping ability, unity of body movement, and synchronisation among the members of the group carry considerable weight in the scoring of technical value and execution. In this study, the explosive, elastic and reactive strength of a high-level aesthetic gymnastics group was measured, evaluated and compared at the beginning and at the end of the competitive period, using the Bosco battery of vertical jump tests, specifically SJ, CMJ, CMJas and RJ (15" CMJas). The temporal synchronisation and/or coordination within the group when executing the technical jump difficulties of the competitive choreographies was also analysed over the competitive period of a high-level aesthetic gymnastics group, taking into account synchronisation at the start and at the end of each difficulty. The results show that elastic-explosive strength in the CMJ decreased by 0.46%, and explosive strength in the SJ (without reuse of elastic energy or exploitation of the myotatic reflex) increased by 4.63%. Over the competitive period of the senior aesthetic gymnastics group of Club Muntanyenc Sant Cugat, the contribution of the arms to jumping ability increased by 1.32%, and alactic anaerobic power by 4.76%. Although the results were positive in most tests, the sample is not considered to have achieved a significant improvement, since it did not exceed the 10% target set at the start of the study and the values obtained are highly unstable. Within a single test, the percentages of losses and gains varied widely, so no relationship can be established between training and improved jumping ability.
As for intergroup temporal synchronisation, it improved by between 37.50% (initial-time synchronisation) and 50.00% (final-time synchronisation) with respect to the technical difficulties, a fact directly related to the automation of execution mechanisms over the competitive season. Even so, the 70% improvement proposed in the study's initial hypotheses was not reached.
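In the Bosco protocol, jump height is estimated from flight time via h = g·t²/8; a worked sketch of that standard relation (the flight times below are illustrative, not the study's data):

```python
def jump_height_m(flight_time_s, g=9.81):
    """Bosco estimate of jump height from flight time: h = g * t^2 / 8."""
    return g * flight_time_s ** 2 / 8

def pct_change(before, after):
    """Percentage change between two test sessions."""
    return 100 * (after - before) / before

h_start = jump_height_m(0.50)   # ~0.307 m at the start of the period
h_end   = jump_height_m(0.52)   # ~0.332 m at the end
print(round(pct_change(h_start, h_end), 2))   # 8.16 (% improvement)
```

Comparing such percentages against the 10% target is exactly the kind of evaluation the study performs across the SJ, CMJ and CMJas tests.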
Abstract:
In a multicenter study, a new, fully automated Roche Diagnostics Elecsys HBsAg II screening assay with improved sensitivity for HBsAg mutant detection was compared to well-established HBsAg tests: AxSYM HBsAg V2 (Abbott), Architect HBsAg (Abbott), Advia Centaur HBsAg (Bayer), Enzygnost HBsAg 5.0 (Dade-Behring), and Vitros Eci HBsAg (Ortho). A total of 16 seroconversion panels, samples of 60 native HBsAg mutants and 31 recombinant HBsAg mutants, dilution series of the NIBSC and PEI standards, 156 HBV positive samples comprising genotypes A to G, 686 preselected HBsAg positive samples from different stages of infection, 3,593 samples from daily routine, and 6,360 unselected blood donations were tested to evaluate the analytical and clinical sensitivity, the detection of mutants, and the specificity of the new assay. Elecsys HBsAg II showed statistically significantly better sensitivity on the seroconversion panels than the comparison tests. Fifty-seven out of 60 native mutants and all recombinant mutants were found positive. Among the 156 HBV samples with different genotypes and 696 preselected HBsAg positive samples, Elecsys HBsAg II achieved a sensitivity of 100%. The lower detection limit for the NIBSC standard was calculated to be 0.025 IU/ml, and for the PEI standards ad and ay it was <0.001 and <0.005 U/ml, respectively. Within 2,724 daily routine specimens and 6,360 unselected blood donations, Elecsys HBsAg II showed a specificity of 99.97% and 99.88%, respectively. In conclusion, the new Elecsys HBsAg II shows high sensitivity for the detection of all stages of HBV infection and of HBsAg mutants, paired with high specificity in blood donors, daily routine samples, and potentially interfering sera.
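Specificity and sensitivity figures like these follow directly from the confusion-matrix counts; a minimal sketch (the counts below are hypothetical, chosen only to land near the reported values):

```python
def specificity_pct(true_neg, false_pos):
    """Specificity = TN / (TN + FP), as a percentage."""
    return 100 * true_neg / (true_neg + false_pos)

def sensitivity_pct(true_pos, false_neg):
    """Sensitivity = TP / (TP + FN), as a percentage."""
    return 100 * true_pos / (true_pos + false_neg)

# e.g. 8 hypothetical false positives among 6,360 blood donations
print(round(specificity_pct(6352, 8), 2))   # 99.87
print(sensitivity_pct(156, 0))              # 100.0
```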
Abstract:
Although sources in general nonlinear mixtures are not separable using only statistical independence, a special and realistic case of nonlinear mixtures, the post-nonlinear (PNL) mixture, is separable by choosing a suitable separating system. A natural approach is then based on estimating the separating system parameters by minimizing an independence criterion, such as the estimated source mutual information. This class of methods requires higher (than 2) order statistics and cannot separate Gaussian sources. However, the use of (weak) priors, such as source temporal correlation or nonstationarity, leads to other source separation algorithms, which are able to separate Gaussian sources and which can even, for a few of them, work with second-order statistics. Recently, modeling time-correlated sources by Markov models, we proposed very efficient algorithms based on minimization of the conditional mutual information. Currently, using the prior of temporally correlated sources, we investigate the feasibility of inverting PNL mixtures with non-bijective nonlinearities, such as quadratic functions. In this paper, we review the main ICA and BSS results for nonlinear mixtures, present PNL models and algorithms, and finish with advanced results using temporally correlated sources.
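The PNL model itself, a linear mixture followed by componentwise invertible nonlinearities, can be sketched as a toy generator in Python (the mixing coefficient and the tanh nonlinearity are illustrative choices, not those of the paper):

```python
import math

def pnl_mix(s1, s2, a=0.6, f=math.tanh):
    """Post-nonlinear mixture: x_i = f(sum_j A_ij * s_j),
    with mixing matrix A = [[1, a], [a, 1]] and f applied componentwise."""
    x1 = [f(u + a * v) for u, v in zip(s1, s2)]
    x2 = [f(a * u + v) for u, v in zip(s1, s2)]
    return x1, x2

# Because tanh is bijective, applying its inverse recovers the *linear*
# mixture, which a standard linear ICA stage could then unmix:
s1, s2 = [0.1, -0.3, 0.2], [0.4, 0.0, -0.1]
x1, _ = pnl_mix(s1, s2)
linear_part = [math.atanh(x) for x in x1]
print(all(abs(l - (u + 0.6 * v)) < 1e-12
          for l, u, v in zip(linear_part, s1, s2)))   # True
```

With a non-bijective nonlinearity such as a quadratic, this inversion step is no longer well defined, which is precisely the difficulty the abstract's current work addresses.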
Abstract:
The Universitat de Vic has, among other equipment, a flexible manufacturing cell from the manufacturer Festo that simulates a pallet-forming process using the products held in an intermediate warehouse. The cell consists of four distinct assembly stations (pallet loading, plate loading, intermediate warehouse, and transport). Each has a SIEMENS S7-300 PLC for its automation, and all of them are interconnected over an industrial Profibus network. The aim of this project is to implement the Vijeo Citect SCADA system for the control and supervision of the warehouse station of this flexible manufacturing cell, also establishing a data exchange between the SCADA and Microsoft Access so that it can be used for teaching. The project was carried out in five phases: 1. The first phase was devoted to the automation of the warehouse station itself using the Siemens S7-300 programmable controller, meeting the stated requirements. 2. In the second phase, the communication for data exchange (reading and writing) between the Vijeo Citect SCADA system and the Microsoft Access database was programmed and established. 3. In the third phase, the graphical environment for supervising and controlling the process was built and programmed with the Vijeo Citect SCADA system. 4. In the fourth phase, an OPC server was installed on the PC and communication was established between the PLC and the SCADA system. 5. Finally, the various programs and communications were revised and debugged so that the system works as a whole.