898 results for design automation of robots


Relevance: 100.00%

Abstract:

The thesis describes the work carried out to develop a prototype knowledge-based system, 'KBS-SETUPP', to generate process plans for the manufacture of seamless tubes. The work relates specifically to a plant in which hollows are made from solid billets using a rotary piercing process and then reduced to the required size and finished properties using the fixed-plug cold drawing process. The thesis first discusses various methods of tube production in order to give a general background of tube manufacture. A review of the automation of the process-planning function is then presented in terms of its basic sub-tasks, and the techniques and suitability of a knowledge-based system are established. In the light of this review and a case study, the process-planning problem is formulated in the domain of seamless tube manufacture, its basic sub-tasks are identified, and the capabilities and constraints of the available equipment in the specific plant are established. The task of collecting and collating process-planning knowledge in seamless tube manufacture is discussed; this knowledge was drawn mostly from domain experts, analysis of existing plant-specific manufacturing records, textbooks and applicable standards. For the cold drawing mill, tube-drawing schedules have been rationalised to correspond with practice. These schedules were validated by computing the process parameters and comparing them with the drawbench capacity to avoid overloading. Since the existing models cannot be simulated directly in the computer program, a mathematical model has been proposed that estimates process parameters in close agreement with experimental values established by other researchers. To implement these concepts, the knowledge-based system 'KBS-SETUPP' has been developed on a personal computer using Turbo Prolog. The system is capable of generating process plans and production schedules, and offers some additional capabilities to supplement process planning. The process plans generated by the system have been compared with the actual plans of the company, and the results show that they are satisfactory and encouraging and that the system's capabilities are useful.
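A process planner of this kind can be sketched as rules that derive a drawing schedule and check it against equipment capacity. The following minimal Python sketch is illustrative only; the per-pass reduction limit and drawbench capacity are assumed placeholder values, not figures from the thesis:

```python
# Minimal rule-based process-planning sketch (illustrative; all numeric
# limits are assumed placeholders, not values from KBS-SETUPP).

def plan_drawing_passes(start_od, target_od, max_reduction=0.15):
    """Generate cold-drawing passes, reducing the outer diameter by at
    most `max_reduction` (fractional) per pass."""
    passes = []
    od = start_od
    while od > target_od:
        od = max(target_od, od * (1 - max_reduction))
        passes.append(round(od, 2))
    return passes

def check_drawbench(load_kn, capacity_kn=300):
    """Reject schedules whose estimated draw load exceeds bench capacity."""
    return load_kn <= capacity_kn

if __name__ == "__main__":
    schedule = plan_drawing_passes(start_od=60.0, target_od=38.0)
    print("Pass schedule (OD, mm):", schedule)
    print("Within drawbench capacity:", check_drawbench(load_kn=250))
```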

Relevance: 100.00%

Abstract:

Trust is a critical component of successful e-Commerce. Given the impersonality, anonymity and automation of transactions, online vendor trustworthiness cannot be assessed by means of body language and other environmental cues that consumers typically use when deciding to trust offline retailers. It is therefore essential that the design of e-Commerce websites compensate by incorporating circumstantial cues in the form of appropriate trust triggers. This paper presents and discusses the results of a study that took an initial look at whether consumers with different personality types (a) are generally more trusting and (b) rely on different trust cues when assessing first-impression vendor trustworthiness in B2C e-Commerce.

Relevance: 100.00%

Abstract:

The paper was presented at the 12th International Conference on Applications of Computer Algebra, Varna, Bulgaria, June 2006.

Relevance: 100.00%

Abstract:

The INFRAWEBS project [INFRAWEBS] considers the use of semantics for the complete lifecycle of Semantic Web processes, which represent complex interactions between Semantic Web Services. One of the main initiatives in the Semantic Web is the WSMO framework, which aims to describe the various aspects of Semantic Web Services in order to enable the automation of Web Service discovery, composition, interoperation and invocation. The paper proposes a conceptual architecture for a BPEL-based INFRAWEBS editor that is intended to construct part of the WSMO descriptions of Semantic Web Services. The semantic description of Web Services has to cover data, functional, execution and QoS semantics. The representation of functional semantics can be achieved by adding the service functionality to the process description. The architecture relies on a functional (operational) semantics of the Business Process Execution Language for Web Services (BPEL4WS) and uses the abstract state machine (ASM) paradigm. This makes it possible to describe the dynamic properties of process descriptions in terms of partially ordered transition rules and to transform them into the WSMO framework.
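The ASM idea of guarded, partially ordered transition rules can be illustrated with a small Python sketch. This is a toy model, not the INFRAWEBS implementation; the state variables and rules are invented for illustration:

```python
# Toy abstract-state-machine interpreter: each rule is a (guard, update)
# pair; in one step, every rule whose guard holds fires on the current
# state (rules are only partially ordered, so no total firing order).

def asm_step(state, rules):
    updates = {}
    for guard, update in rules:
        if guard(state):
            updates.update(update(state))
    state.update(updates)
    return state

# Invented example: a BPEL-like invoke activity moving through states.
rules = [
    (lambda s: s["activity"] == "ready",
     lambda s: {"activity": "invoking"}),
    (lambda s: s["activity"] == "invoking" and s["reply_received"],
     lambda s: {"activity": "completed"}),
]

state = {"activity": "ready", "reply_received": False}
state = asm_step(state, rules)      # -> activity: "invoking"
state["reply_received"] = True
state = asm_step(state, rules)      # -> activity: "completed"
print(state)
```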

Relevance: 100.00%

Abstract:

Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2015.

Relevance: 100.00%

Abstract:

This research studies the application of syntagmatic analysis of texts written in Brazilian Portuguese as a methodology for the automatic creation of extractive summaries. Summary automation, a topic within natural language processing (NLP), studies ways in which the computer can autonomously construct summaries of texts. We start from the premise that teaching the computer how a language is structured, in our case Brazilian Portuguese, will help it discover the most relevant sentences and, consequently, build extractive summaries with higher informativeness. In this study, we propose a summarization method that automatically performs the syntagmatic analysis of texts and uses it to build an automatic summary. The phrases that make up the syntactic structures are used to analyse the sentences of the text, and the count of these elements determines whether or not a sentence will be included in the generated summary.
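The sentence-scoring step can be sketched roughly as follows in Python. This is a simplification: a plain word-frequency count stands in for a proper syntagmatic phrase count, and the selection ratio is arbitrary:

```python
# Rough extractive-summarization sketch: score each sentence by the
# frequency of its tokens, then keep the top-scoring sentences in their
# original order. A real implementation following the abstract would
# count noun/verb phrases from a syntactic parse instead.
import re
from collections import Counter

def summarize(text, ratio=0.3):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))

    def score(sentence):
        return sum(freq[w] for w in re.findall(r"\w+", sentence.lower()))

    top = sorted(sentences, key=score, reverse=True)
    keep = set(top[:max(1, int(len(sentences) * ratio))])
    return " ".join(s for s in sentences if s in keep)
```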

Relevance: 100.00%

Abstract:

The growing need for food is a worldwide concern: the population grows in geometric progression while its resources grow in arithmetic progression. Proposals to alleviate this problem include increasing food production and reducing food waste. Many studies have been conducted around the world aiming to reduce food waste, which can reach 40% of production depending on the region. Techniques that retard the degradation of foods, including drying, are used for this purpose. This paper presents the design of a hybrid fruit dryer that uses solar and electric energy, with automation of the process. For the drying tests, typical fruits with good acceptability as processed fruits were chosen. During the experiments, temperature values were measured at different points, along with humidity, solar radiation and mass. A data acquisition system was built using an Arduino to obtain the temperatures. The data were sent to a program named Secador de Frutas, developed in this work, to plot them. The volume of the drying chamber was 423 litres, and despite the unusual size, tests using mirrors to increase the incidence of direct radiation showed that the dryer is competitive with other solar dryers produced in the Hydraulic Machines and Solar Energy Laboratory (LMHES) at UFRN. The dryer was built at a cost 3 to 5 times lower than that of industrial dryers that operate with the same load of fruit, and the energy cost of producing dried fruits was more favourable than that of such dryers using LPG as an energy source. The drying time, however, was longer.
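The PC side of such a data acquisition chain can be sketched as below. This is a hypothetical sketch, not the actual Secador de Frutas program; the serial port name, baud rate and line format are assumptions:

```python
# Hypothetical PC-side logger: read temperature lines sent by an Arduino
# over serial and plot them. Assumes the Arduino prints one reading per
# line, e.g. "25.4"; the port name and baud rate are placeholders.
import serial                    # pip install pyserial
import matplotlib.pyplot as plt

readings = []
with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as port:
    for _ in range(60):                      # one minute at ~1 Hz
        line = port.readline().decode(errors="ignore").strip()
        try:
            readings.append(float(line))
        except ValueError:
            continue                         # skip malformed lines

plt.plot(readings)
plt.xlabel("sample")
plt.ylabel("temperature (°C)")
plt.title("Drying-chamber temperature")
plt.show()
```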

Relevance: 100.00%

Abstract:

Professionals in several areas of health (pediatricians, nutritionists, orthopedists, endocrinologists, dentists, etc.) use bone age assessment to diagnose growth disorders in children. Through interviews with specialists in diagnostic imaging and research in the literature, we identified the TW (Tanner-Whitehouse) method as the most efficient. Even though it achieves better results than other methods, it is still not the most widely used, owing to the complexity of its use. This work demonstrates the possibility of automating the method so that its use can become more widespread. Two important steps in the evaluation of bone age are addressed: the identification and the classification of regions of interest. Even in radiographs in which the positioning of the hands was not suitable for the TW method, the finger-identification algorithm showed good results. Likewise, the use of AAM (Active Appearance Models) gave good results in the identification of regions of interest, even in radiographs with high variation in contrast and brightness. Appearance-based classification of the epiphyses into their stages of development also performed well; the middle epiphysis of finger III (middle finger) was chosen to demonstrate the performance. The final results show an average hit rate of 90%, and in the misclassified cases the error was only one stage away from the correct stage.
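In the TW scheme, each region of interest is assigned a maturity stage, stages map to numeric scores, and the summed score is converted to a bone age. A minimal Python sketch of that aggregation step follows; the stage-score table and the score-to-age conversion are invented placeholders, not the published TW tables:

```python
# Minimal TW-style aggregation sketch. The stage-score table and the
# score-to-bone-age conversion are placeholder values, NOT the published
# Tanner-Whitehouse tables.

STAGE_SCORES = {"B": 10, "C": 20, "D": 35, "E": 50,
                "F": 70, "G": 90, "H": 110, "I": 130}

def bone_age_from_stages(stages):
    """stages: dict mapping ROI name -> classified stage letter."""
    total = sum(STAGE_SCORES[s] for s in stages.values())
    # Placeholder linear conversion from maturity score to years.
    return round(total / 100.0, 1)

roi_stages = {"radius": "F", "ulna": "E", "finger3_middle_epiphysis": "F"}
print("Estimated bone age (years):", bone_age_from_stages(roi_stages))
```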

Relevance: 100.00%

Abstract:

Tide gauge data are identified as legacy data, given the radical transition in observation method and required output format associated with tide gauges over the 20th century. Observed water level variation through tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea level and storm surge. Few tide gauge records cover the 20th century, such that the Belfast (UK) Harbour tide gauge would be a strategic long-term (110 years) record if the full paper-based records (marigrams) were digitally restructured to allow for consistent data analysis. This paper presents the methodology of extracting a consistent time series of observed water levels from the 5 different Belfast Harbour tide gauge positions/machine types, starting in late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly, relative to any manual x–y digitisation of the signal. Restructuring variable-length tidal data sets into a consistent daily, monthly and annual file format was undertaken with project-developed software: Merge&Convert and MergeHYD allow consistent water level sampling both at 60 min (the past standard) and 10 min intervals, the latter enhancing surge measurement. The Belfast tide-gauge data have been rectified, validated and quality controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
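Restructuring digitised marigram traces onto consistent fixed intervals is essentially a resampling problem. The rough pandas sketch below is not the project's Merge&Convert/MergeHYD software; the column name and interpolation choice are assumptions:

```python
# Rough resampling sketch: place an irregularly spaced digitised
# water-level trace onto fixed 60-minute and 10-minute grids. The column
# name and interpolation method are illustrative, not those of
# Merge&Convert/MergeHYD.
import pandas as pd

def resample_levels(df, interval="60min"):
    """df: DataFrame with a DatetimeIndex and a 'level_m' column of
    irregularly spaced digitised water levels."""
    regular = df.resample(interval).mean()        # bin onto the grid
    return regular.interpolate(method="time")     # fill short gaps

# Example usage with a tiny made-up trace:
idx = pd.to_datetime(["1901-12-01 00:03", "1901-12-01 00:58",
                      "1901-12-01 02:01"])
trace = pd.DataFrame({"level_m": [1.52, 2.10, 1.87]}, index=idx)
hourly = resample_levels(trace, "60min")
ten_minute = resample_levels(trace, "10min")
```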

Relevance: 100.00%

Abstract:

The automation of logistics processes represents a major technical challenge owing to dynamic process properties and economic requirements. There is a need for novel, highly flexible automation and robotics solutions that are able to handle variable goods or to carry out different processes and functionalities. This contribution addresses the increase of flexibility by means of two concrete examples from the fields of piece-goods handling and material-flow technology.

Relevance: 100.00%

Abstract:

The research presented in this dissertation is aimed at the automation of the Fenton process. The Fenton reaction is used in wastewater pre-treatment to promote the abatement of organic contaminants and make the water more degradable. The reagents consist of a mixture of iron ions and hydrogen peroxide, and their effect is strongly influenced by several variables, such as the reagents' molar ratio and their quantities relative to the substrate, temperature, pH, agitation, etc. Optimisation is therefore far from an easy procedure. The research was first carried out in a batch configuration, through which the optimal [Fe2+]/[H2O2] and [substrate]/[H2O2] ratios were identified. Then, in order to improve the process, a semibatch configuration was adopted. The preliminary results show that it is possible to obtain a greater abatement efficiency for high organic loads using the semibatch configuration proposed here.
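Once target ratios are fixed, translating them into reagent doses is simple arithmetic. A hedged Python sketch follows; the ratio values are placeholders, not the optima identified in the dissertation:

```python
# Reagent-dosing sketch for a Fenton treatment. The default target ratios
# are placeholder values, NOT the optima identified in the dissertation.

def fenton_doses(substrate_mol, fe_per_h2o2=0.05, substrate_per_h2o2=0.5):
    """Return (Fe2+ mol, H2O2 mol) for a given substrate amount, from the
    target [Fe2+]/[H2O2] and [substrate]/[H2O2] molar ratios."""
    h2o2 = substrate_mol / substrate_per_h2o2
    fe = fe_per_h2o2 * h2o2
    return fe, h2o2

fe_mol, h2o2_mol = fenton_doses(substrate_mol=0.10)
print(f"Dose: {fe_mol:.3f} mol Fe2+, {h2o2_mol:.3f} mol H2O2")
```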

Relevance: 100.00%

Abstract:

Continuous delivery (CD) is a software engineering approach in which the focus lies on creating a short delivery cycle by automating parts of the deployment pipeline, which includes the build, deploy, test and release processes. CD is based on the principle that, during development, it should always be possible to automatically generate a release from the source code in its current state. One of CD's many advantages is that continuous releases provide a quick feedback loop, leading to faster and more efficient implementation of new functions and of error fixes. Although CD has many advantages, there are also several challenges a maintenance management project must face in the transition to CD. These challenges may differ depending on the maturity level of the project and on its strengths and weaknesses. Our research question was: "What challenges can a maintenance management project face in the transition to continuous delivery?" The purpose of this study is to describe continuous delivery and the challenges a maintenance management project may face during such a transition. A descriptive case study was carried out using interviews and documents as data collection methods. A situation analysis was created from the collected data in the shape of a process model representing the project's release process. The process model was used as the basis for a SWOT analysis and for an analysis using Rehn et al.'s maturity model. From these analyses we identified challenges a maintenance management project may face in the transition to CD. Some concern the customers' and the management's attitude towards such a transition, but the biggest challenge concerns the automation of the deployment pipeline steps.
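Automating the pipeline steps means chaining build, test and deploy so that a release candidate is produced from the current source state without manual hand-offs. A minimal Python sketch follows; the commands are generic placeholders, and any real pipeline would substitute its own tools:

```python
# Minimal deployment-pipeline driver: run build, test and deploy stages
# in order and stop at the first failure. The commands are generic
# placeholders for whatever build/test/deploy tooling a project uses.
import subprocess
import sys

STAGES = [
    ("build",  ["make", "build"]),
    ("test",   ["make", "test"]),
    ("deploy", ["make", "deploy"]),
]

for name, cmd in STAGES:
    print(f"--- stage: {name} ---")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"pipeline failed at stage '{name}'")
print("release candidate produced")
```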

Relevance: 100.00%

Abstract:

To remain competitive, forestry companies seek to control their procurement costs. Harvester-processors are equipped with on-board computers that allow certain functions to be controlled and automated. However, these technologies are not commonly used and are, at best, under-used. While the industry is showing growing interest in the use of these computers, little research has examined the gains in productivity and in conformity to bucking specifications resulting from the use of these systems. The objective of the study was to measure the impact of three degrees of automation (manual, semi-automatic and automatic) on productivity (m3 per productive machine hour) and on the conformity rate of the lengths and topping diameters of processed logs (%). Data collection took place in the harvest blocks of Produits forestiers Résolu, north of Lac St-Jean, between January and August 2015. A complete block design was set up for each of the five operators who took part in the study. A 5% threshold was used for the analysis of variance, after contrasts. Only one case showed a significant difference in productivity attributable to the change in the degree of automation used, while no significant difference was detected for the conformity of topping diameters; trends were nevertheless observed. The length conformities obtained by two operators showed significant differences. As these operators worked on two distinct machines, this also suggests the impact the operator can have on the length conformity rate.

Relevance: 100.00%

Abstract:

This report was prepared within the scope of a curricular internship carried out at the Technical and Certification Services Directorate of the Instituto dos Vinhos do Douro e do Porto, I.P. Its main objective was to present proposals for changing internal procedures of the laboratory of the Instituto dos Vinhos do Douro e do Porto, I.P., in order to reduce the response time to requests, seeking better use of both equipment and human resources. To achieve this objective, a diagnosis of the situation was drawn up based on information collected at the Technical and Certification Services Directorate, including data provided by the GLAB software, which supports the operations carried out in the laboratory of the Instituto dos Vinhos do Douro e do Porto. In addition, the circuit of wine product samples was followed from their reception, through analysis in the laboratory sectors, to the subsequent validation of results. Since the study of sample intake in the Mineral Analysis sector was largely inconclusive, the cost per analysis for the determination of lead was calculated, and a simulation was carried out to determine the expense required to reduce the number of samples analysed each time the equipment is switched on. The cost of determining the furfural parameter, an analysis that can be performed either in the Gas Chromatography sector or in the Liquid Chromatography sector, was also compared; several statistical tests were used for this. Based on the evaluation carried out, the following improvement opportunities were identified and proposed: automation of one of the activities carried out in the Physical Chemistry I sector, an activity necessary for the preparation of analyses; counterbalancing the seasonality observed in the reception of customer samples with samples from the Inspection and Control Services Directorate; and the insertion of some variables into the GLAB software. It should be noted that the proposal concerning the automation of one of the activities carried out in the Physical Chemistry sector was implemented by the IVDP. Proposals for future research were also identified.

Relevance: 100.00%

Abstract:

The difficulties encountered in implementing large-scale continuum mechanics (CM) codes on multiprocessor systems are now fairly well understood. Despite the claims of shared memory architecture manufacturers to provide effective parallelizing compilers, these have not proved adequate for large or complex programs. Significant programmer effort is usually required to achieve reasonable parallel efficiencies on significant numbers of processors. The paradigm of Single Program Multiple Data (SPMD) domain decomposition with message passing, where each processor runs the same code on a subdomain of the problem and communicates through the exchange of messages, has for some time been demonstrated to provide the required level of efficiency, scalability and portability across both shared and distributed memory systems, without the need to re-author the code into a new language or even to support differing message passing implementations. Extension of the methods into three dimensions has been enabled through the engineering of PHYSICA, a framework for supporting 3D, unstructured-mesh continuum mechanics modelling. In PHYSICA, six inspectors are used. Part of the challenge in automating parallelization is being able to prove the equivalence of inspectors so that they can be merged into as few as possible.
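The SPMD pattern can be illustrated with a small mpi4py sketch: every rank runs the same program on its own subdomain and exchanges boundary (halo) values with its neighbours. This is illustrative only, with a 1-D halo exchange standing in for the unstructured-mesh case handled by PHYSICA:

```python
# SPMD halo-exchange sketch (run e.g. with `mpiexec -n 4 python halo.py`).
# Each rank owns a local subdomain with one ghost cell at each end and
# swaps edge values with its neighbours -- a 1-D stand-in for
# unstructured-mesh domain decomposition.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

u = np.full(10 + 2, float(rank))   # interior cells plus two ghost cells

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange halos: send my edge cells, receive neighbours' into ghosts.
comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

print(f"rank {rank}: ghost cells = ({u[0]}, {u[-1]})")
```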