941 results for Machine performance


Relevance:

100.00%

Publisher:

Abstract:

In this paper we consider transcripts originating from a practical series of Turing's Imitation Game held on 23rd June 2012 at Bletchley Park, England. In some cases the tests involved a 3-participant simultaneous comparison of two hidden entities, whereas others were the result of a direct 2-participant interaction. Each of the transcripts considered here resulted in a human interrogator being fooled, by a machine, into concluding that they had been conversing with a human. Particular features of the conversations are highlighted, successful ploys on the part of each machine are discussed, and likely reasons for the interrogator being fooled are considered. Subsequent feedback from the interrogators involved is also included.

Relevance:

70.00%

Publisher:

Abstract:

Sugarcane mechanized planting is becoming increasingly widespread in Brazil owing to its higher operability and the better working conditions it offers workers compared with other types of planting, yet studies on this topic in Brazil remain scarce. In this context, the aim of this study was to evaluate the operational quality of sugarcane mechanized planting in two operation shifts by means of statistical process control. The mechanized planting was carried out in March 2012, and the statistical design was completely randomized with two treatments, totalling 40 replications for the day shift and 40 for the night shift. The variables evaluated were: speed, engine rotation, engine oil pressure, engine water temperature, effective field capacity, and hourly and effective fuel consumption. The use of statistical control charts showed that the process was not governed by intrinsic random causes alone. The tractor alignment error showed outliers in both the day- and night-shift operations, indicating a possible delay in receiving the signal. The engine water temperature and the effective fuel consumption showed lower variability in the nighttime operation, with average values of 81 °C and 22.66 L ha-1, respectively. The hourly fuel consumption showed greater variability, and consequently lower quality, during the night-shift operation, with an average consumption of 25.46 L h-1, while the day shift showed 26.86 L h-1.
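To make the control-chart step above concrete, the sketch below computes the limits of an individuals control chart for a set of hourly fuel consumption readings; the readings, variable names and the I-MR chart choice are illustrative assumptions, not the study's data or exact procedure.

```python
# Minimal sketch of an individuals (I-MR) control chart check, as used in
# statistical process control; the readings below are hypothetical, not the
# study's data.
import numpy as np

def control_limits(x):
    """Return (LCL, centre, UCL) for an individuals chart.

    Sigma is estimated from the mean moving range divided by d2 = 1.128,
    the standard constant for subgroups of size 2.
    """
    x = np.asarray(x, dtype=float)
    centre = x.mean()
    mean_moving_range = np.abs(np.diff(x)).mean()
    sigma = mean_moving_range / 1.128
    return centre - 3 * sigma, centre, centre + 3 * sigma

# Hypothetical hourly fuel consumption readings (L/h) from one shift
readings = [26.1, 27.0, 26.5, 25.9, 27.4, 26.8, 29.9, 26.2]
lcl, centre, ucl = control_limits(readings)
out_of_control = [v for v in readings if v < lcl or v > ucl]
print(f"LCL={lcl:.2f}  centre={centre:.2f}  UCL={ucl:.2f}")
print("points beyond the control limits:", out_of_control)
```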

Relevance:

70.00%

Publisher:

Abstract:

The collect-and-place machine is one of the most widely used placement machines for assembling electronic components on printed circuit boards (PCBs). Nevertheless, very little research has addressed the optimisation of its performance. This motivates us to study the component scheduling problem for this type of machine, with the objective of minimising the total assembly time. The component scheduling problem is an integration of the component sequencing problem, that is, the sequencing of component placements, and the feeder arrangement problem, that is, the assignment of component types to feeders. To solve the component scheduling problem efficiently, a hybrid genetic algorithm is developed in this paper. A numerical example is used to compare the performance of the algorithm with different component grouping approaches and different population sizes.
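As a rough illustration of the kind of search involved, the sketch below implements a plain permutation-encoded genetic algorithm for the component sequencing part only, using placement-head travel distance as a stand-in for assembly time; the operators, parameters and coordinates are assumptions and do not reproduce the paper's hybrid algorithm or its feeder-arrangement component.

```python
# Toy permutation-encoded GA for component placement sequencing.
import random

def tour_length(order, coords):
    # Manhattan travel distance between consecutive placements (proxy for time)
    return sum(abs(coords[a][0] - coords[b][0]) + abs(coords[a][1] - coords[b][1])
               for a, b in zip(order, order[1:]))

def crossover(p1, p2):
    # Order crossover (OX): copy a slice from p1, fill the rest in p2's order.
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(order, rate=0.1):
    # Occasional swap of two placements
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def ga(coords, pop_size=40, generations=200):
    pop = [random.sample(range(len(coords)), len(coords)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: tour_length(o, coords))
        elite = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=lambda o: tour_length(o, coords))

# Hypothetical PCB placement coordinates (mm)
points = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(15)]
best = ga(points)
print("best sequence:", best, "travel length:", round(tour_length(best, points), 1))
```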

Relevance:

70.00%

Publisher:

Abstract:

As machine tools continue to become increasingly repeatable and accurate, high-precision manufacturers may be tempted to consider how they might use machine tools as measurement systems. In this paper we have explored this paradigm by attempting to repurpose state-of-the-art coordinate measuring machine Uncertainty Evaluating Software (UES) for a machine tool application. We performed live measurements on all the systems in question. Our findings have highlighted some gaps in UES when applied to machine tools, and we have attempted to identify the sources of variation that led to the discrepancies. The implications of this research include the need to evolve the algorithms within the UES if it is to be adapted for on-machine measurement, to improve the robustness of the input parameters and, most importantly, to clarify expectations.

Relevance:

60.00%

Publisher:

Abstract:

Machine downtime, whether planned or unplanned, is intuitively costly to manufacturing organisations but is often very difficult to quantify. The available literature shows that costing processes are rarely undertaken within manufacturing organisations, and where cost analyses have been undertaken they generally value only a small proportion of the affected costs, leading to overly conservative estimates. This thesis aimed to develop a cost-of-downtime model, with particular emphasis on its application to Australia Post's Flat Mail Optical Character Reader (FMOCR). The costing analysis determined a cost of downtime of $5,700,000 per annum, or an average cost of $138 per operational hour. The second section of this work focused on using the cost of downtime to objectively determine areas of opportunity for cost reduction on the FMOCR. This was the first time within Post that maintenance costs were considered alongside downtime when determining machine performance, and as a result the analysis revealed areas that had historically not been targeted for cost reduction. Further exploratory work was undertaken on the Flats Lift Module (FLM) and Auto Induction Station (AIS) Deceleration Belts through comparison of the results against two additional FMOCR analysis programs. This research has demonstrated the development of a methodical and quantifiable cost of downtime for the FMOCR; it is the first time that Post has endeavoured to examine the cost of downtime, and it is one of the very few methodologies for valuing downtime costs proposed in the literature. The work has also demonstrated how the cost of downtime can be incorporated into machine performance analysis, with specific application to identifying high-cost modules. The outcomes of this work are both a methodology for costing downtime and a list of areas for cost reduction; in delivering these, the thesis has addressed the two key deliverables set out at the outset of the research.
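As a quick arithmetic aside, the two headline figures above imply the number of operational hours the model must assume; the short check below is only that implication, since the abstract does not state the hours directly.

```python
# The annual downtime cost divided by the cost per operational hour implies
# the operational hours assumed in the model (not stated in the abstract).
annual_cost = 5_700_000          # $ per annum
cost_per_operational_hour = 138  # $ per operational hour
implied_hours = annual_cost / cost_per_operational_hour
print(f"implied operational hours per annum: {implied_hours:,.0f}")  # roughly 41,300
```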

Relevance:

60.00%

Publisher:

Abstract:

Since 1993 we have been working on the automation of dragline excavators, the largest earthmoving machines that exist. Recently we completed a large-scale experimental program where the automation system was used for production purposes over a two week period and moved over 200,000 tonnes of overburden. This is a landmark achievement in the history of automated excavation. In this paper we briefly describe the robotic system and how it works cooperatively with the machine operator. We then describe our methodology for gauging machine performance, analyze results from the production trial and comment on the effectiveness of the system that we have created. © Springer-Verlag Berlin Heidelberg 2006.

Relevance:

60.00%

Publisher:

Abstract:

In order to carry out the Multi-annual Guidance Programmes (MGP), the national fishing fleets of the EU were divided into mostly homogeneous fleet segments. The current paper describes the individual segments of the German fishing fleet and summarizes their characteristics, such as vessel capacity (tonnage in GRT), machine performance (power in kW) and vessel size (total length in m). A further table lists the average landings, separated by stock and segment, for the period from 1990 to 1994.

Relevance:

60.00%

Publisher:

Abstract:

Soil conditioning consists of mixing and remoulding the natural material, generally at low depth, with additives during the mechanical excavation of tunnels, in order to give the excavated material suitable plasticity and consistency so that it can apply a counterpressure against the natural earth pressure and the groundwater flow towards the excavation chamber. The assessment and control of the soil parameters and of the machine performance are fundamental for a regular and safe excavation, also with regard to surface stability. This paper focuses mainly on a testing approach aimed at proper soil conditioning with EPB shields, whose results have been validated at real scale. The influence of the water content and of the amount of conditioning foam has been studied by the authors. A proper definition of the conditioning parameters can extend the field of application of Earth Pressure Balance (EPB) tunnelling machines to various soil grain-size distributions, and even to weak rock formations (e.g. siltstone or flysch). The importance of conditioning is also reflected in the possibility of proper spoil disposal or, better, of its reuse.

Relevance:

60.00%

Publisher:

Abstract:

This study aims to evaluate the use of Varian radiotherapy dynamic treatment log (DynaLog) files to verify IMRT plan delivery as part of a routine quality assurance procedure. Delivery accuracy in terms of machine performance was quantified by multileaf collimator (MLC) position errors and fluence delivery accuracy for patients receiving intensity modulated radiation therapy (IMRT) treatment. The relationship between machine performance and plan complexity, quantified by the modulation complexity score (MCS), was also investigated. Actual MLC positions and the delivered fraction of monitor units (MU), recorded every 50 ms during IMRT delivery, were extracted from the DynaLog files. The planned MLC positions and fractional MU were taken from the record-and-verify system MLC control file. Planned and delivered beam data were compared to determine leaf position errors with and without the overshoot effect. Analysis was also performed on planned and actual fluence maps reconstructed from the MLC control file and the delivered treatment log files, respectively. This analysis was performed for all treatment fractions for 5 prostate, 5 prostate and pelvic node (PPN) and 5 head and neck (H&N) IMRT plans, totalling 82 IMRT fields in ∼5500 DynaLog files. The root mean square (RMS) leaf position errors without the overshoot effect were 0.09, 0.26 and 0.19 mm for the prostate, PPN and H&N plans respectively, which increased to 0.30, 0.39 and 0.30 mm when the overshoot effect was considered. Average errors were not affected by the overshoot effect and were 0.05, 0.13 and 0.17 mm for the prostate, PPN and H&N plans respectively. The percentage of pixels passing fluence map gamma analysis at 3%/3 mm was 99.94 ± 0.25%, which reduced to 91.62 ± 11.39% at the 1%/1 mm criterion. Leaf position errors, but not the gamma passing rate, were directly related to plan complexity as determined by the MCS. Site-specific confidence intervals for average leaf position errors were set at -0.03 to 0.12 mm for prostate plans and -0.02 to 0.28 mm for the more complex PPN and H&N plans. For all treatment sites, the confidence interval for RMS errors with the overshoot effect was set at 0 to 0.50 mm, and for the percentage of pixels passing gamma analysis at 1%/1 mm a confidence interval of 68.83% was set, also for all treatment sites. This work demonstrates the successful implementation of treatment log files to validate IMRT deliveries and shows how dynamic log files can diagnose delivery errors that phantom-based QC cannot. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.
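The sketch below illustrates the basic planned-versus-delivered comparison described above, computing mean and RMS leaf position errors; the array layout and the toy positions are assumptions, and DynaLog parsing, the overshoot correction and the gamma analysis are not shown.

```python
# Minimal sketch of a leaf-position comparison: mean and RMS error between
# planned and delivered MLC positions. Shapes and values are illustrative.
import numpy as np

def leaf_position_errors(planned_mm, delivered_mm):
    """Return (mean error, RMS error) in mm over all leaves and time samples."""
    diff = np.asarray(delivered_mm, dtype=float) - np.asarray(planned_mm, dtype=float)
    return diff.mean(), np.sqrt((diff ** 2).mean())

# Hypothetical positions for 3 leaves sampled at 4 control points (mm)
planned   = [[10.0, 12.0, 14.0, 16.0],
             [-5.0, -4.5, -4.0, -3.5],
             [ 0.0,  0.5,  1.0,  1.5]]
delivered = [[10.1, 12.0, 14.2, 15.9],
             [-5.0, -4.6, -3.9, -3.5],
             [ 0.1,  0.5,  1.1,  1.4]]

mean_err, rms_err = leaf_position_errors(planned, delivered)
print(f"mean error = {mean_err:.2f} mm, RMS error = {rms_err:.2f} mm")
```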

Relevance:

60.00%

Publisher:

Abstract:

This thesis first addresses the problem of modelling pianists' performances using machine learning. It then presents new temporal models that use auto-encoders to improve sequence learning. First, we present previous work in the field of modelling musical expressivity, in particular the statistical models of Professor Widmer. We then describe our dataset, unique in the world, which had to be created to accomplish our task. This dataset comprises 13 different pianists recorded on the famous Bösendorfer 290SE piano. Finally, we explain in detail the results of training neural networks and recurrent neural networks, applied to the aforementioned data to learn the expressive variations characteristic of a style of music. Second, this thesis addresses the development of experimental statistical models that involve the use of auto-encoders with recurrent neural networks. To test the limits of their learning capacity, we use two artificial datasets developed at the Université de Toronto.
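For readers unfamiliar with the recurrent models mentioned above, the sketch below shows the forward pass of a plain recurrent cell on a toy sequence; the dimensions, parameters and feature names are assumptions, and this is not the thesis's architecture, which additionally involves auto-encoders and trained networks.

```python
# Minimal sketch of a vanilla recurrent cell: forward pass only, random
# (untrained) parameters, toy inputs.
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 8

# Randomly initialised parameters (training is omitted)
W_in  = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_rec = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_out = rng.normal(scale=0.1, size=(1, hidden_dim))
b_h   = np.zeros(hidden_dim)

def rnn_forward(sequence):
    """Run h_t = tanh(W_in x_t + W_rec h_{t-1} + b) and emit one scalar
    prediction per time step (e.g. an expressive timing deviation)."""
    h = np.zeros(hidden_dim)
    outputs = []
    for x_t in sequence:
        h = np.tanh(W_in @ x_t + W_rec @ h + b_h)
        outputs.append((W_out @ h).item())
    return outputs

# Toy sequence of 10 feature vectors (e.g. note-level score features)
toy_sequence = rng.normal(size=(10, input_dim))
print(rnn_forward(toy_sequence))
```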

Relevance:

60.00%

Publisher:

Abstract:

The increase in the number of attacks on computer networks has been addressed by increasing the resources applied directly to the active routing equipment of these networks. In this context, firewalls have become established as essential elements in controlling the inbound and outbound packets of a network. With the advent of intrusion detection systems (IDS), efforts have been made to incorporate packet filtering based on signatures, until then a passive element, into traditional firewalls, integrating the IDS functions with the functions already present in the firewall. Despite the efficiency this incorporation brings in blocking attacks with known signatures, application-level filtering introduces a natural delay in the analysed packets and can reduce the machine's performance in filtering the remaining packets, because of the machine resources demanded by this level of filtering. This work presents models that treat this problem by re-routing packets for analysis to a sub-network with specific filters. The proposed implementation of this model aims to reduce the performance problem and to open space for scenarios in which other non-conventional filtering solutions (spam blocking, P2P traffic control/blocking, etc.) can be inserted into the filtering sub-network without overloading the main firewall of a corporate network.
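The sketch below illustrates the re-routing idea in its simplest form: the main firewall applies only cheap header checks and diverts selected traffic towards a filtering sub-network for deeper, signature-based analysis; the gateway address, port set and data structures are hypothetical.

```python
# Toy model of the re-routing decision at the main firewall: divert traffic
# that merits application-level analysis to a filtering sub-network, keep
# everything else on the fast path. All values are illustrative.
from dataclasses import dataclass

FILTER_SUBNET_GATEWAY = "10.0.9.1"        # hypothetical analysis sub-network
DEEP_INSPECTION_PORTS = {25, 80, 443}     # traffic worth application-level checks

@dataclass
class Packet:
    src: str
    dst: str
    dst_port: int
    payload: bytes

def route(packet: Packet) -> str:
    """Return the next hop: the filtering sub-network for traffic that needs
    application-level analysis, otherwise the normal forwarding path."""
    if packet.dst_port in DEEP_INSPECTION_PORTS:
        return FILTER_SUBNET_GATEWAY       # handled by IDS/spam/P2P filters there
    return "default"                       # fast path, no extra delay

print(route(Packet("192.0.2.10", "198.51.100.5", 80, b"GET / HTTP/1.1")))   # diverted
print(route(Packet("192.0.2.10", "198.51.100.5", 22, b"...")))              # fast path
```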

Relevance:

60.00%

Publisher:

Abstract:

Postgraduate programme in Agronomy (Soil Science) - FCAV

Relevance:

60.00%

Publisher:

Abstract:

Machine translation systems have been increasingly used for the translation of large volumes of specialized texts. The efficiency of these systems depends directly on the implementation of strategies for controlling the lexical use of source texts, as a way to guarantee machine performance and, ultimately, human revision and post-editing work. This paper presents a brief history of the application of machine translation, introduces the concepts of lexicon and ambiguity, and focuses on some of the lexical control strategies presently used, discussing their possible implications for the production and reading of specialized texts.

Relevance:

60.00%

Publisher:

Abstract:

Postgraduate programme in Agronomy (Soil Science) - FCAV