898 results for "desig automation of robots"


Relevance:

100.00%

Publisher:

Abstract:

This work studies and develops machine automation for tyre truing. It discusses industrial development, the automation of industrial processes, the tyre-recycling process, the environmental advantages of tyre reuse, and the likely gains from automating part of the tyre-recycling process. The text details the work done to configure, program and optimize the truing process using automation components such as CNC, PLC and drives. Tests and simulations are performed to determine the payback period required, given the gains in productivity and profit.
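The payback computation mentioned at the end of the abstract can be sketched in a few lines (a generic illustration with hypothetical figures and a hypothetical function name, not the study's actual numbers or method):

```python
def payback_period(investment, monthly_gain):
    """Months needed for cumulative gains to cover the initial investment."""
    if monthly_gain <= 0:
        raise ValueError("monthly gain must be positive")
    months = 0
    recovered = 0.0
    while recovered < investment:
        recovered += monthly_gain
        months += 1
    return months

# Example: a 120,000 investment recovered at 5,000/month pays back in 24 months.
print(payback_period(120_000, 5_000))  # → 24
```

In practice the gains would come from the productivity and profit improvements measured in the tests and simulations.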

Relevance:

100.00%

Publisher:

Abstract:

The exceptional advance of information technology and the application of computers to the mineral sector have allowed the automation of several processes in the mineral value chain. ERP (Enterprise Resource Planning) systems provided the platform for the efficient integration of all support activities of the mineral value chain. Despite all the advances achieved with the application of computers, it has not been possible to date to effectively integrate the primary activities of the mineral value chain. The main reasons for this are the uncertainties present in the productive process, which are intrinsic to the business, and the difficulty of quantifying and qualifying the benefits obtained from this integration, owing to the lack of a clear definition of key performance indicators (KPIs). This work presents an analysis of the application of ERP systems in Brazilian mining, identifies the KPIs of some of the most important Brazilian mining companies, and discusses the importance of mapping and measuring these indicators for the effective management of the mining business.

Relevance:

100.00%

Publisher:

Abstract:

Traditional abduction imposes as a precondition the restriction that the background information may not derive the goal data. In first-order logic such a precondition is, in general, undecidable. To avoid this problem, we present a first-order cut-based abduction method, which has KE-tableaux as its underlying inference system. This inference system allows for the automation of non-analytic proofs in a tableau setting, which permits a generalization of traditional abduction that avoids the undecidable-precondition problem. After demonstrating the correctness of the method, we show how it can be dynamically iterated in a process that leads to the construction of non-analytic first-order proofs and, in some terminating cases, to refutations as well.

Relevance:

100.00%

Publisher:

Abstract:

Nuclear magnetic resonance (NMR) is one of the most versatile analytical techniques for chemical, biochemical and medical applications. Despite this great success, NMR is seldom used as a tool in industrial applications. The first application of NMR to flowing samples was published in 1951. However, only in the last ten years has flow NMR gained momentum, with new and potential applications being proposed. In this review we present the historical evolution of flow (or online) NMR spectroscopy and imaging, and current developments for use in the automation of industrial processes.

Relevance:

100.00%

Publisher:

Abstract:

Recent statistics have shown that two of the most important causes of failure in UAV (Uninhabited Aerial Vehicle) missions are the low level of decisional autonomy of the vehicles and the man-machine interface. A relevant issue is therefore to design a display/controls architecture that allows efficient interaction between the operator and the remote vehicle, and to develop a level of automation that lets the vehicle decide on changes to its mission. The research presented in this paper focuses on a modular man-machine interface simulator for UAV control, which simulates UAV missions and was developed to experiment with solutions to this problem. The main components of the simulator are an advanced interface and a block, named automation, comprising an algorithm that implements the system's level of automation. The simulator was designed and developed following a user-centred design approach, in order to take the operator's needs into account in communication with the vehicle. The level of automation was developed following supervisory control theory, in which the human becomes a supervisor who sends high-level commands, such as parts of the mission, targets and constraints expressed as if-then rules, while the vehicle receives, comprehends and translates such commands into detailed actions, such as routes or actions on the control system. To allow the vehicle to calculate and recalculate a safe and efficient route, in terms of distance, time and fuel, a 3D planning algorithm was developed. It is based on treating UAVs, representative of real-world systems, as objects moving in a virtual environment (terrain, obstacles and no-fly zones) that replicates the airspace. Original obstacle-avoidance strategies were conceived in order to generate mission plans consistent with flight rules and with the vehicle's performance constraints.
The interface is based on a touch screen, used to send high-level commands to the vehicle, and a 3D virtual display, which provides a stereoscopic and augmented visualization of the complex scenario in which the vehicle operates. It is also provided with an audio feedback message generator. Simulation tests were conducted with pilot trainers to evaluate the reliability of the algorithm and the effectiveness and efficiency of the interface in supporting the operator in supervising a UAV mission. Results revealed that the planning algorithm calculates very efficient routes in a few seconds, that an adequate level of workload is required to command the vehicle, and that the 3D-based interface provides the operator with a good sense of presence and enhances awareness of the mission scenario and of the vehicle under control.
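The kind of 3D route planning over a virtual airspace with obstacles and no-fly zones described above can be illustrated with a standard A* search on a grid (a minimal sketch only; the paper's actual planner and obstacle-avoidance strategies are not reproduced here, and the grid model, unit step costs and function name are assumptions):

```python
import heapq

def astar_3d(start, goal, obstacles, size):
    """A* shortest path on a 6-connected 3D grid; the obstacle set models
    terrain, buildings and no-fly zones. Illustrative sketch only."""
    def h(p):  # Manhattan-distance heuristic (admissible on a unit-cost grid)
        return sum(abs(a - b) for a, b in zip(p, goal))
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        x, y, z = node
        for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            nxt = (x + dx, y + dy, z + dz)
            if all(0 <= c < s for c, s in zip(nxt, size)) and nxt not in obstacles:
                heapq.heappush(frontier, (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # no route avoids the obstacles

# A small airspace with a vertical wall the route must climb over.
wall = {(1, y, 0) for y in range(3)}
route = astar_3d((0, 0, 0), (2, 0, 0), wall, (3, 3, 2))
print(len(route))  # → 5
```

A real planner would also weight edges by time and fuel and respect the vehicle's performance constraints, as the abstract describes.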

Relevance:

100.00%

Publisher:

Abstract:

Visual tracking is the problem of estimating variables related to a target, given a video sequence depicting that target. It is key to the automation of many tasks, such as visual surveillance, autonomous navigation of robots or vehicles, and automatic video indexing in multimedia databases. Despite many years of research, long-term tracking of generic targets in real-world scenarios remains unaccomplished. The main contribution of this thesis is the definition of effective algorithms that can foster a general solution to visual tracking by letting the tracker adapt to changing working conditions. In particular, we propose to adapt two crucial components of visual trackers: the transition model and the appearance model. The less general but widespread case of tracking from a static camera is also considered, and a novel change detection algorithm robust to sudden illumination changes is proposed. Based on this, a principled adaptive framework to model the interaction between Bayesian change detection and recursive Bayesian trackers is introduced. Finally, the problem of automatic tracker initialization is considered. In particular, a novel solution for the categorization of 3D data is presented. The novel category recognition algorithm is based on a novel 3D descriptor that achieves state-of-the-art performance in several surface-matching applications.
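A recursive Bayesian tracker of the kind the thesis builds on can be illustrated, in its very simplest form, by a one-dimensional Kalman filter (purely illustrative: the thesis's adaptive transition and appearance models go far beyond this sketch, and the function name and noise parameters here are assumptions):

```python
def kalman_1d(measurements, q=1e-3, r=0.25):
    """Minimal 1D constant-position Kalman filter: the simplest recursive
    Bayesian tracker. q is process-noise variance, r measurement-noise variance."""
    x, p = measurements[0], 1.0   # initial state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += q                    # predict: the transition model inflates uncertainty
        k = p / (p + r)           # Kalman gain weighs prediction vs. measurement
        x += k * (z - x)          # update the state with the new observation
        p *= (1 - k)
        estimates.append(x)
    return estimates

# Noisy readings of a target sitting near position 10.0 are smoothed toward it.
track = kalman_1d([10.3, 9.8, 10.1, 9.9, 10.2])
print(round(track[-1], 2))
```

Adapting the transition model, as the thesis proposes, would amount to adjusting quantities like `q` online as the target's motion changes.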

Relevance:

100.00%

Publisher:

Abstract:

Antibiotic resistance is a public-health problem; tackling it requires a surveillance system based on the collection and analysis of epidemiological laboratory data. This doctoral project consisted of developing a web application, usable at the hospital level, for managing such antibiotic-susceptibility data from clinical isolates. A web platform associated with a relational database was created, so as to have a dynamic application that can be updated simply by inserting new data, without manually modifying the HTML pages that make up the application itself. The open-source database MySQL was used, as it offers numerous advantages: it is extremely stable, performs well, is supported by a large online community, and is free. The dynamic content of the web application must be generated by a scripting language that automates the insertion, modification, deletion and display of large quantities of data. PHP was chosen: an open-source language developed specifically for building dynamic web pages, which works seamlessly with MySQL. The database architecture was defined by creating the tables containing the data and the relationships between them: patient records, sample data, isolated microorganisms, and antibiograms with the interpretive categories for each antibiotic. Once the tables and relationships were defined, the code for the main functions was written: manual entry of antibiograms; import of multiple antibiograms from files exported by automated instruments; modification and deletion of antibiograms previously entered into the system; and analysis of the data in the database, showing trends in the prevalence of microbial species and in their drug resistance, accompanied by charts.
Development included continuous testing of the functions as they were implemented, using real clinical data; appropriate validation checks were introduced, along with a simple, clean graphical layout.
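The relational schema and the kind of trend analysis described above can be sketched as follows (an illustrative sketch using Python's built-in SQLite in place of the application's actual MySQL/PHP stack; all table and column names are hypothetical, not taken from the real system):

```python
import sqlite3

# Minimal version of the described schema: samples, isolated organisms, and
# antibiograms carrying an interpretive category per antibiotic (S/I/R).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sample   (id INTEGER PRIMARY KEY, patient TEXT, ward TEXT, taken DATE);
CREATE TABLE organism (id INTEGER PRIMARY KEY,
                       sample_id INTEGER REFERENCES sample(id),
                       species TEXT);
CREATE TABLE antibiogram (id INTEGER PRIMARY KEY,
                          organism_id INTEGER REFERENCES organism(id),
                          antibiotic TEXT,
                          category TEXT CHECK (category IN ('S', 'I', 'R')));
""")
conn.execute("INSERT INTO sample VALUES (1, 'P001', 'ICU', '2013-05-02')")
conn.execute("INSERT INTO organism VALUES (1, 1, 'E. coli')")
conn.executemany("INSERT INTO antibiogram VALUES (NULL, 1, ?, ?)",
                 [("ampicillin", "R"), ("gentamicin", "S")])

# Resistance prevalence per antibiotic -- the kind of trend query the
# application's analysis pages would run before charting the results.
rows = conn.execute("""
    SELECT antibiotic,
           100.0 * SUM(category = 'R') / COUNT(*) AS pct_resistant
    FROM antibiogram GROUP BY antibiotic ORDER BY antibiotic
""").fetchall()
print(rows)  # → [('ampicillin', 100.0), ('gentamicin', 0.0)]
```

The real application would add the import of instrument-exported files and per-period grouping for the prevalence trends.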

Relevance:

100.00%

Publisher:

Abstract:

In 2010 more than 600 radiocarbon samples were measured with the gas ion source at the MIni CArbon DAting System (MICADAS) at ETH Zurich, and the number of measurements is rising quickly. While most samples currently contain less than 50 μg C, the gas ion source is also attractive for larger samples because the time-consuming graphitization is omitted. Additionally, modern samples are now measured down to 5 per-mill counting statistics in less than 30 min with the recently improved gas ion source. In the versatile gas handling system, a stepping-motor-driven syringe presses a mixture of helium and sample CO2 into the gas ion source, allowing continuous and stable measurements of different kinds of samples. CO2 can be provided to the versatile gas interface in four different ways. As the primary method, CO2 is delivered in glass or quartz ampoules; in this case, the CO2 is released in an automated ampoule cracker with 8 positions for individual samples. Secondly, OX-1 and blank gas in helium can be provided to the syringe by connecting gas bottles directly to the gas interface at the cracker stage. Thirdly, solid samples can be combusted in an elemental analyzer or in a thermo-optical OC/EC aerosol analyzer, with the produced CO2 transferred to the syringe via a zeolite trap for gas concentration. As a fourth method, CO2 is released from carbonates with phosphoric acid in septum-sealed vials and loaded onto the same trap used for the elemental analyzer. All four methods allow complete automation of the measurement, even though minor user input is presently still required. Details on the setup, versatility and applications of the gas handling system are given. (C) 2012 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing services have emerged as an essential component of the enterprise IT infrastructure. Migration towards a full-range, large-scale convergence of cloud and network services has become the current trend for addressing the requirements of the cloud environment. Our approach takes the infrastructure-as-a-service paradigm to build converged virtual infrastructures, which allow tailored performance to be offered and enable multi-tenancy over a common physical infrastructure. Thanks to virtualization, new exploitation activities of the physical infrastructures may arise for both transport network and data-centre services. This approach makes the network and data-centre resources dedicated to cloud computing converge at the same flexible and scalable level. The work presented here is based on the automation of the virtual infrastructure provisioning service. On top of the virtual infrastructures, coordinated operation and control of the different resources is performed with the objective of automatically tailoring connectivity services to the dynamics of the cloud service. Furthermore, in order to support elasticity of cloud services through the optical network, dynamic re-planning features have been added to the virtual infrastructure service, allowing existing virtual infrastructures to be scaled up or down to optimize resource utilisation and dynamically adapt to users' demands. Dynamic re-planning of the service thus becomes a key component for coordinating cloud and optical-network resources optimally in terms of resource utilisation. The presented work is complemented with a use case in which the virtual infrastructure service is adopted in a distributed enterprise information system that scales up and down as a function of application requests.

Relevance:

100.00%

Publisher:

Abstract:

A CE system featuring an array of 16 contactless conductivity detectors was constructed. The detectors were arranged along a 70 cm length of a capillary of 100 cm total length and allow the monitoring of separation processes. As the detectors cannot be accommodated on a conventional commercial instrument, a purpose-built set-up employing a sequential injection manifold had to be used to automate the fluid handling. Conductivity measurements can be considered universal for electrophoresis, and thus any changes in ionic composition can be monitored. The progress of the separation of Na(+) and K(+) is demonstrated. The potential of the system for the study of processes in CZE is shown in two examples. The first demonstrates the differences in the development of peaks originating from a sample plug with a purely aqueous background compared with a plug containing the analyte ions in the buffer. The second example visualizes the opposite migration of cations and anions from a sample plug that had been placed in the middle of the capillary.

Relevance:

100.00%

Publisher:

Abstract:

Organizing and archiving statistical results, and processing a subset of those results for publication, are important and often underestimated aspects of conducting statistical analyses. Because automation of these tasks is often poor, processing results produced by statistical packages is laborious and vulnerable to error. I therefore present a new package called estout that facilitates and automates some of these tasks. This new command can be used to produce regression tables for use with spreadsheets, LaTeX, HTML, or word processors. For example, the results for multiple models can be organized in spreadsheets and thus archived in an orderly manner. Alternatively, the results can be saved directly as a publication-ready table for inclusion in, for example, a LaTeX document. estout is implemented as a wrapper for estimates table but has many additional features, such as support for mfx. Despite its flexibility, estout is, I believe, still very straightforward and easy to use. Furthermore, estout can be customized via so-called defaults files. A tool called estadd, which makes supplementary statistics available, is also provided.

Relevance:

100.00%

Publisher:

Abstract:

Access to information for the user community of libraries and documentation centres, always present in the ideals of the field of library science, has in recent times been consolidated in the discourses of the information society, especially through the incorporation of new technologies that mediate the relationships between librarians and users. In this vein, this work investigates, on the one hand, the incorporation and use of information and communication technologies in the library processes carried out in libraries directed by graduates of the Library Science programme of the Faculty of Humanities and Educational Sciences of the Universidad Nacional de La Plata (FAHCE-UNLP), Argentina. On the other hand, it traces the technological imaginaries these graduates hold, proposing a ten-year projection for these same information units. To this end, a search for those currently holding management positions or roles in different information units yielded a list of 34 people. They were sent by e-mail a semi-structured questionnaire asking about: 1) their personal details and institutional affiliation; 2) the state of the automation processes in the libraries where they work; 3) the application of technologies in the different processes; 4) their undergraduate training in topics related to the use of technologies, as well as the courses and seminars on these topics taken in the last five years; and 5) the prospects for future development that this group of professionals imagines for the libraries they direct.
This last aspect was taken up again in in-depth semi-structured interviews that asked the participants to place themselves ten years in the future and reflect on possible changes in communication with users, the facilities, the composition of the bibliographic collection, the documentation services to be provided, and other aspects related to library management and the user community's access to information. The analysis of the information obtained from both the questionnaires and the interviews aims to elucidate whether the user community's access to information can be realized, now or in the near future, in or through these libraries.


Relevance:

100.00%

Publisher:

Abstract:

Postestimation processing and formatting of regression estimates for input into document tables are tasks that many of us have to do. However, processing results by hand is laborious and vulnerable to error. There are therefore many benefits to automating these tasks while retaining user flexibility in the output format. The estout package meets these needs. estout assembles a table of coefficients, "significance stars", summary statistics, standard errors, t/z statistics, p-values, confidence intervals, and other statistics calculated for up to twenty models previously fitted and stored by estimates store. It then writes the table to the Stata log and/or to a text file. The estimates can optionally be formatted in several styles: html, LaTeX, or tab-delimited (for input into MS Excel or Word). There are a large number of options controlling which output is formatted and how. This talk will take users through a range of examples, from relatively simple applications to complex ones.