36 results for Machine-tools.
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
With the relentless quest for improved performance driving ever tighter manufacturing tolerances, machine tools are sometimes unable to meet the desired requirements. One option to improve the tolerances of machine tools is to compensate for their errors. Among all possible sources of machine tool error, thermally induced errors are in general the most important for newer machines. The present work demonstrates the evaluation and modelling of the thermal error behaviour of a CNC cylindrical grinding machine during its warm-up period.
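As an illustration only (not the authors' actual model), a warm-up thermal-error model of the kind described above can be as simple as a regression relating a temperature reading to the measured drift; the data and variable names below are assumed.

```python
import numpy as np

# Assumed toy warm-up data: spindle temperature rise [deg C] and measured
# thermal drift of the grinding wheel head [micrometres].
temp_rise = np.array([0.0, 1.5, 3.0, 4.2, 5.1, 5.8, 6.2, 6.5])
drift_um = np.array([0.0, 4.0, 8.5, 11.0, 13.2, 14.5, 15.3, 15.8])

# Fit a second-order polynomial: drift = f(temperature rise).
model = np.poly1d(np.polyfit(temp_rise, drift_um, deg=2))

# The predicted drift at a new temperature reading could be fed back as a compensation offset.
print(model(4.8))
```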
Abstract:
This paper proposes an architecture for machining process and production monitoring to be applied in machine tools with open computer numerical control (CNC). A brief description of the advantages of using open CNC for machining process and production monitoring is presented, with an emphasis on the CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses the CNC data and sensors to gather information about the machining process and production. It allows the development of different levels of monitoring systems with minimum investment, minimum need for sensor installation, and low intrusiveness to the process. Successful examples of the utilization of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.
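As a purely illustrative sketch of the kind of monitoring loop such an architecture enables (the function `read_cnc_variable` below is a hypothetical stand-in, not a real open-CNC API), a PC-based monitor might poll controller data and flag anomalies:

```python
import time

def read_cnc_variable(name: str) -> float:
    """Hypothetical stand-in for the open CNC data-access call."""
    return 0.0  # a real system would query the controller here (e.g. spindle load, feed rate)

def monitor(threshold: float = 80.0, period_s: float = 0.1, cycles: int = 100) -> None:
    """Poll the spindle load and warn when it exceeds a threshold (possible tool wear)."""
    for _ in range(cycles):
        load = read_cnc_variable("spindle_load_percent")
        if load > threshold:
            print(f"warning: spindle load {load:.1f}% above {threshold}%")
        time.sleep(period_s)

monitor(cycles=10)
```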
Abstract:
Compliant mechanisms can achieve a specified motion as a mechanism without relying on the use of joints and pins. They have broad application in precision mechanical devices and Micro-Electro Mechanical Systems (MEMS), but may lose accuracy and produce undesirable displacements when subjected to temperature changes. These undesirable effects can be reduced by using sensors in combination with control techniques and/or by applying special design techniques to reduce them at the design stage, a process generally termed "design for precision". This paper describes a design for precision method based on a topology optimization method (TOM) for compliant mechanisms that includes thermal compensation features. The optimization problem emphasizes actuator accuracy, and it is formulated to yield optimal compliant mechanism configurations that maximize the desired output displacement when a force is applied, while minimizing undesirable thermal effects. To demonstrate the effectiveness of the method, two-dimensional compliant mechanisms are designed considering thermal compensation, and their performance is compared with compliant mechanism designs that do not consider thermal compensation. (C) 2010 Elsevier B.V. All rights reserved.
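A generic statement of this kind of multi-objective problem (a hedged sketch; the symbols below are generic and not the exact functional used in the paper) is:

```latex
\begin{aligned}
\max_{\rho}\quad & u_{\mathrm{out}}(\rho, F)\;-\;w\,\bigl|u_{\mathrm{out}}(\rho, \Delta T)\bigr| \\
\text{s.t.}\quad & \mathbf{K}(\rho)\,\mathbf{u} = \mathbf{f}(F,\Delta T),\qquad
\int_{\Omega}\rho\,\mathrm{d}\Omega \le V_{\max},\qquad 0 \le \rho \le 1,
\end{aligned}
```

where rho is the material density distribution, u_out the output-port displacement, F the applied force, Delta T the temperature change, w a weighting factor, K the stiffness matrix, and V_max the allowed material volume.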
Abstract:
Species' potential distribution modelling consists of building a representation of the fundamental ecological requirements of a species from the biotic and abiotic conditions where the species is known to occur. Such models can be valuable tools to understand the biogeography of species and to support the prediction of their presence/absence under a particular environmental scenario. This paper investigates the use of different supervised machine learning techniques to model the potential distribution of 35 plant species from Latin America. Each technique was able to extract a different representation of the relations between the environmental conditions and the distribution profile of the species. The experimental results highlight the good performance of random trees classifiers, indicating this particular technique as a promising candidate for modelling species' potential distribution. (C) 2010 Elsevier Ltd. All rights reserved.
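As a rough illustration of this kind of supervised distribution modelling (a random forest is used below as a stand-in for the tree-based family evaluated in the paper; the data and predictors are assumed), a presence/absence classifier can be trained on environmental variables:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Assumed toy data: rows are sites, columns are environmental predictors
# (e.g. temperature, precipitation, elevation); y is presence (1) / absence (0).
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(model, X, y, cv=5).mean())  # mean accuracy over 5 folds
```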
Abstract:
Complex networks have been increasingly used in text analysis, including in connection with natural language processing tools, as important text features appear to be captured by the topology and dynamics of the networks. Following previous works that apply complex networks concepts to text quality measurement, summary evaluation, and author characterization, we now focus on machine translation (MT). In this paper we assess the possible representation of texts as complex networks to evaluate cross-linguistic issues inherent in manual and machine translation. We show that translations of different quality generated by MT tools can be distinguished from their manual counterparts by means of metrics such as in-degrees (ID), out-degrees (OD), clustering coefficients (CC), and shortest paths (SP). For instance, we demonstrate that the average OD in networks of automatic translations consistently exceeds the values obtained for manual ones, and that the CC values of source texts are not preserved for manual translations, but are for good automatic translations. This probably reflects the text rearrangements humans perform during manual translation. We envisage that such findings could lead to better MT tools and automatic evaluation metrics.
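To make the network metrics concrete, the sketch below builds a directed word-adjacency network from a toy sentence and computes the measures cited above (average out-degree, clustering coefficient, average shortest path); the toy text and the adjacency-window choice are assumptions, not the paper's exact modelling.

```python
import networkx as nx

def adjacency_network(text: str) -> nx.DiGraph:
    """Directed network linking each word to the word that follows it."""
    words = text.lower().split()
    g = nx.DiGraph()
    g.add_edges_from(zip(words, words[1:]))
    return g

g = adjacency_network("the translated sentence follows the source sentence structure")

avg_od = sum(d for _, d in g.out_degree()) / g.number_of_nodes()   # average OD
avg_cc = nx.average_clustering(g.to_undirected())                  # average CC
avg_sp = nx.average_shortest_path_length(g.to_undirected())        # average SP
print(avg_od, avg_cc, avg_sp)
```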
Abstract:
Due to the imprecise nature of biological experiments, biological data is often characterized by the presence of redundant and noisy data. This may be due to errors that occurred during data collection, such as contamination of laboratory samples. It is the case of gene expression data, where the equipment and tools currently used frequently produce noisy data. Machine Learning algorithms have been successfully used in gene expression data analysis. Although many Machine Learning algorithms can deal with noise, detecting and removing noisy instances from the training data set can help the induction of the target hypothesis. This paper evaluates the use of distance-based pre-processing techniques for noise detection in gene expression data classification problems. This evaluation analyzes the effectiveness of the techniques investigated in removing noisy data, measured by the accuracy obtained by different Machine Learning classifiers over the pre-processed data.
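A minimal example of distance-based filtering of this kind (a sketch of one such technique, Wilson's Edited Nearest Neighbours, not necessarily the exact pre-processing used in the paper): an instance is discarded when the majority of its k nearest neighbours carries a different class label.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def edited_nearest_neighbours(X, y, k=3):
    """Keep an instance only if the majority of its k nearest neighbours shares its label."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)   # +1: each point is its own nearest neighbour
    _, idx = nn.kneighbors(X)
    neighbour_labels = y[idx[:, 1:]]                   # drop the point itself
    agreement = (neighbour_labels == y[:, None]).sum(axis=1)
    return agreement > k / 2

# Assumed toy gene-expression-like data: 100 samples x 50 genes, two classes, 10% label noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))
y = (X[:, 0] > 0).astype(int)
noisy = rng.choice(100, size=10, replace=False)
y[noisy] = 1 - y[noisy]

keep = edited_nearest_neighbours(X, y)
print(f"{keep.sum()} of {len(y)} instances kept after noise filtering")
```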
Abstract:
This work proposes a new approach using a committee machine of artificial neural networks to classify masses found in mammograms as benign or malignant. Three shape factors, three edge-sharpness measures, and 14 texture measures are used for the classification of 20 regions of interest (ROIs) related to malignant tumors and 37 ROIs related to benign masses. A group of multilayer perceptrons (MLPs) is employed as a committee machine of neural network classifiers. The classification results are reached by combining the responses of the individual classifiers. Experiments involving changes in the learning algorithm of the committee machine are conducted. The classification accuracy is evaluated using the area A(z) under the receiver operating characteristic (ROC) curve. The A(z) result for the committee machine is compared with the A(z) results obtained using MLPs and single-layer perceptrons (SLPs), as well as a linear discriminant analysis (LDA) classifier. Tests are carried out using Student's t-distribution. The committee machine classifier outperforms the MLP, SLP, and LDA classifiers in the following cases: with the shape measure of spiculation index, the A(z) values of the four methods are, in order, 0.93, 0.84, 0.75, and 0.76; and with the edge-sharpness measure of acutance, the values are 0.79, 0.70, 0.69, and 0.74. Although the features with which improvement is obtained with the committee machines are not the same as those that provided the maximal value of A(z) (A(z) = 0.99 with some shape features, with or without the committee machine), they correspond to features that are not critically dependent on the accuracy of the boundaries of the masses, which is an important result. (c) 2008 SPIE and IS&T.
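The committee idea can be sketched as follows (a simplified stand-in using scikit-learn, not the authors' original implementation): several MLPs are trained with different random initializations, their predicted probabilities are averaged, and the area A(z) under the ROC curve is computed for the combined response. The data here are synthetic placeholders for the shape, edge-sharpness, and texture features.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Assumed stand-in for the 20 feature values per ROI (shape, edge-sharpness, texture).
X = rng.normal(size=(57, 20))
y = np.r_[np.ones(20), np.zeros(37)]          # 20 malignant, 37 benign ROIs
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# Committee: MLPs that differ only in their random initialization.
committee = [MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=s).fit(X_tr, y_tr)
             for s in range(5)]
p = np.mean([m.predict_proba(X_te)[:, 1] for m in committee], axis=0)  # combined response
print("committee A(z):", roc_auc_score(y_te, p))
```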
Abstract:
Balance problems in hemiparetic patients after stroke can be caused by different impairments in the physiological systems involved in postural control, including sensory afferents, movement strategies, biomechanical constraints, cognitive processing, and perception of verticality. Balance impairments and disabilities must be appropriately addressed. This article reviews the most common balance abnormalities in hemiparetic patients with stroke and the main tools used to diagnose them.
Abstract:
We study the star/galaxy classification efficiency of 13 different decision tree algorithms applied to photometric objects in the Sloan Digital Sky Survey Data Release Seven (SDSS-DR7). Each algorithm is defined by a set of parameters which, when varied, produce different final classification trees. We extensively explore the parameter space of each algorithm, using the set of 884,126 SDSS objects with spectroscopic data as the training set. The efficiency of star-galaxy separation is measured using the completeness function. We find that the Functional Tree algorithm (FT) yields the best results as measured by the mean completeness in two magnitude intervals: 14 <= r <= 21 (85.2%) and r >= 19 (82.1%). We compare the performance of the tree generated with the optimal FT configuration to the classifications provided by the SDSS parametric classifier, 2DPHOT, and Ball et al. We find that our FT classifier is comparable to or better in completeness over the full magnitude range 15 <= r <= 21, with much lower contamination than all but the Ball et al. classifier. At the faintest magnitudes (r > 19), our classifier is the only one that maintains high completeness (> 80%) while simultaneously achieving low contamination (approximately 2.5%). We also examine the SDSS parametric classifier (psfMag - modelMag) to see if the dividing line between stars and galaxies can be adjusted to improve the classifier. We find that currently stars in close pairs are often misclassified as galaxies, and suggest a new cut to improve the classifier. Finally, we apply our FT classifier to separate stars from galaxies in the full set of 69,545,326 SDSS photometric objects in the magnitude range 14 <= r <= 21.
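As a minimal illustration of tree-based star/galaxy separation and of the completeness and contamination measures (using an ordinary decision tree, not the Functional Tree algorithm itself; the photometric features are assumed toy data), see the sketch below.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Assumed toy photometric features (e.g. magnitudes, psfMag - modelMag); 1 = galaxy, 0 = star.
X = rng.normal(size=(5000, 4))
y = (X[:, 0] + 0.3 * X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=8, random_state=0).fit(X_tr, y_tr)
pred = tree.predict(X_te)

# Completeness: recovered galaxies / all true galaxies.
completeness = ((pred == 1) & (y_te == 1)).sum() / (y_te == 1).sum()
# Contamination: stars classified as galaxies / all objects classified as galaxies.
contamination = ((pred == 1) & (y_te == 0)).sum() / max((pred == 1).sum(), 1)
print(completeness, contamination)
```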
Abstract:
The aim of the study was to evaluate the possible relationships between stress tolerance, training load, banal infections and salivary parameters during 4 weeks of regular training in fifteen basketball players. The Daily Analysis of Life Demands for Athletes' questionnaire (sources and symptoms of stress) and the Wisconsin Upper Respiratory Symptom Survey were used on a weekly basis. Salivary cortisol and salivary immunoglobulin A (SIgA) were collected before and after the study, and measured by enzyme-linked immunosorbent assay (ELISA). Ratings of perceived exertion (training load) were also obtained. The results from ANOVA with repeated measures showed greater training loads, number of upper respiratory tract infection episodes and negative sensation to both symptoms and sources of stress at week 2 (p < 0.05). Significant increases in cortisol levels and decreases in SIgA secretion rate were noted from before to after the study. Negative sensations to symptoms of stress at week 4 were inversely and significantly correlated with SIgA secretion rate. A positive and significant relationship between sources and symptoms of stress at week 4 and cortisol levels was verified. In summary, an approach combining psychometric tools and salivary biomarkers could be an efficient means of monitoring reaction to stress in sport. Copyright (C) 2010 John Wiley & Sons, Ltd.
Abstract:
The main purpose of this paper is to present the architecture of an automated system that allows monitoring and tracking in real time (online) the possible occurrence of faults and electromagnetic transients observed in primary power distribution networks. Through the interconnection of this automated system to the utility operation center, it will be possible to provide an efficient tool that will assist in decision-making by the Operation Center. In short, the desired purpose is to have all the tools necessary to identify, almost instantaneously, the occurrence of faults and transient disturbances in the primary power distribution system, as well as to determine their respective origin and probable location. The compiled results from the application of this automated system show that the developed techniques provide accurate results, identifying and locating several occurrences of faults observed in the distribution system.
Abstract:
Conventional threading operations involve two distinct machining processes: drilling and threading. They are therefore time consuming, since the tools must be changed and the workpiece has to be moved to another machine. This paper presents an analysis of the combined process (drilling followed by threading) using a single tool for both operations: the tap-milling tool. Before presenting the methodology used to evaluate this hybrid tool, the basics of ODS (operating deflection shapes) are briefly described. ODS and finite element modeling (FEM) were used during this research to optimize the process, aiming to achieve more stable machining conditions and increase tool life. Both methods allowed the determination of the natural frequencies and displacements of the machining center and the optimization of the workpiece fixture system. The results showed that there is an excellent correlation between the dynamic stability of the machining center-tool holder and the tool life, avoiding premature catastrophic tool failure. Nevertheless, evidence showed that the tool is very sensitive to working conditions. Undoubtedly, the use of ODS and FEM eliminates empirical decisions concerning the optimization of machining conditions and drastically increases tool life. After the ODS and FEM studies, it was possible to optimize the process and the work material fixture system and to machine more than 30,000 threaded holes without reaching the tool life limit or catastrophic failure.
Abstract:
Sigma phase is a deleterious phase that can form in duplex stainless steels during heat treatment or welding. To track this transformation, ferrite and sigma phase percentages and hardness were measured on samples of a UNS S31803 duplex stainless steel submitted to heat treatment. These results were compared to measurements obtained from ultrasound and eddy current techniques, i.e., velocity and impedance, respectively. Additionally, backscattered signals produced by wave propagation were acquired during ultrasonic inspection, as well as magnetic Barkhausen noise during magnetic inspection. Both signal types were processed via a combination of detrended-fluctuation analysis (DFA) and principal component analysis (PCA). The techniques used were proven to be sensitive to changes in the samples related to sigma phase formation due to heat treatment. Furthermore, these methods have the advantage of being nondestructive. (C) 2010 Elsevier B.V. All rights reserved.
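To make the signal-processing chain concrete, the sketch below implements a basic detrended-fluctuation analysis and feeds the resulting fluctuation curves from several signals to PCA; the synthetic signals stand in for the ultrasonic backscatter and Barkhausen noise, and the scale range is an assumption.

```python
import numpy as np
from sklearn.decomposition import PCA

def dfa_fluctuations(x, scales):
    """Detrended-fluctuation analysis: RMS fluctuation F(n) for each window size n."""
    y = np.cumsum(x - np.mean(x))                          # integrated (profile) series
    F = []
    for n in scales:
        f2 = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
            f2.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

rng = np.random.default_rng(0)
scales = np.array([16, 32, 64, 128, 256])
# Assumed stand-ins for backscattered ultrasound / Barkhausen signals from several samples.
signals = [rng.normal(size=4096) for _ in range(10)]
features = np.array([np.log(dfa_fluctuations(s, scales)) for s in signals])

scores = PCA(n_components=2).fit_transform(features)       # project samples onto first two PCs
print(scores)
```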
Abstract:
Micro-tools offer significant promise in a wide range of applications such as cell manipulation, microsurgery, and micro/nanotechnology processes. Such special micro-tools consist of multi-flexible structures actuated by two or more piezoceramic devices that must generate output displacements and forces at different specified points of the domain and in different directions. The micro-tool structure acts as a mechanical transformer by amplifying and changing the direction of the piezoceramics' output displacements. The design of these micro-tools involves minimization of the coupling among movements generated by the various piezoceramics. To obtain enhanced micro-tool performance, the concept of multifunctional and functionally graded materials is extended by tailoring the elastic and piezoelectric properties of the piezoceramics while simultaneously optimizing the multi-flexible structural configuration using multiphysics topology optimization. The design process considers the influence of piezoceramic property gradation and also its polarization sign. The method is implemented considering continuum material distribution with special interpolation of fictitious densities in the design domain. As examples, designs of a single piezoactuator, an XY nano-positioner actuated by two graded piezoceramics, and a micro-gripper actuated by three graded piezoceramics are considered. The results show that material gradation plays an important role in improving actuator performance, and may also lead to optimal displacements and coupling ratios with a reduced amount of piezoelectric material. The present examples are limited to two-dimensional models because many of the applications for such micro-tools are planar devices. Copyright (c) 2008 John Wiley & Sons, Ltd.
Abstract:
Solid-liquid phase equilibrium modeling of triacylglycerol mixtures is essential for lipid design. Considering the alpha polymorph and the liquid phase as ideal, the Margules 2-suffix excess Gibbs energy model with predictive binary parameter correlations describes the non-ideal beta and beta' solid polymorphs. Solving by direct optimization of the Gibbs free energy enables one to predict, from a bulk mixture composition, the phase compositions at a given temperature and thus the SFC curve, the melting profile and the Differential Scanning Calorimetry (DSC) curve, which are related to end-user lipid properties. Phase diagram, SFC and DSC curve experimental data are qualitatively and quantitatively well predicted for the binary mixture of 1,3-dipalmitoyl-2-oleoyl-sn-glycerol (POP) and 1,2,3-tripalmitoyl-sn-glycerol (PPP), for the ternary mixture of 1,3-dimyristoyl-2-palmitoyl-sn-glycerol (MPM), 1,2-distearoyl-3-oleoyl-sn-glycerol (SSO) and 1,2,3-trioleoyl-sn-glycerol (OOO), and for palm oil and cocoa butter. Then, the addition of Medium-Long-Medium type structured lipids to palm oil is evaluated, using caprylic acid as the medium chain and long-chain fatty acids (EPA, eicosapentaenoic acid; DHA, docosahexaenoic acid; gamma-linolenic, octadecatrienoic acid; and AA, arachidonic acid) as sn-2 substitutes. EPA, DHA and AA increase the melting range on both the fusion and crystallization sides. Gamma-linolenic acid shifts the melting range upwards. This predictive tool is useful for the pre-screening of lipids matching desired properties set a priori.
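For reference, the 2-suffix (one-parameter) Margules model mentioned above has, for a binary pair, the standard form below; the specific predictive correlations for the binary parameter A used in the paper are not reproduced here.

```latex
\frac{G^{E}}{RT} = \frac{A}{RT}\,x_{1}x_{2},
\qquad
\ln\gamma_{1} = \frac{A}{RT}\,x_{2}^{2},
\qquad
\ln\gamma_{2} = \frac{A}{RT}\,x_{1}^{2}
```

where x_i are the mole fractions, gamma_i the activity coefficients of the two triacylglycerols in the solid solution, and A the binary interaction parameter.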